Console Output (build status: Failed)

[Skipping 11,755 KB of log output]
-4315-95dc-535ab7d3cd23/srcpart specified for non-external table:srcpart
19:19:48.659 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart1 specified for non-external table:srcpart1
[info] - Partition pruning - with filter on string partition key - query test (1 second, 830 milliseconds)
[info] - Partition pruning - with filter on int partition key - pruning test (40 milliseconds)
19:19:49.783 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
19:19:49.910 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart specified for non-external table:srcpart
19:19:51.526 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart1 specified for non-external table:srcpart1
[info] - Partition pruning - with filter on int partition key - query test (2 seconds, 837 milliseconds)
[info] - Partition pruning - left only 1 partition - pruning test (49 milliseconds)
19:19:52.708 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
19:19:52.853 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart specified for non-external table:srcpart
19:19:53.644 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart1 specified for non-external table:srcpart1
[info] - Partition pruning - left only 1 partition - query test (1 second, 960 milliseconds)
[info] - Partition pruning - all partitions pruned - pruning test (41 milliseconds)
19:19:54.671 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
19:19:54.791 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart specified for non-external table:srcpart
19:19:55.518 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart1 specified for non-external table:srcpart1
[info] - Partition pruning - all partitions pruned - query test (1 second, 816 milliseconds)
[info] - Partition pruning - pruning with both column key and partition key - pruning test (44 milliseconds)
19:19:56.469 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
19:19:56.576 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart specified for non-external table:srcpart
19:19:57.304 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/srcpart1 specified for non-external table:srcpart1
[info] - Partition pruning - pruning with both column key and partition key - query test (1 second, 832 milliseconds)
[info] HiveUDFSuite:
19:19:58.505 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
[info] - spark sql udf test that returns a struct (342 milliseconds)
[info] - SPARK-4785 When called with arguments referring column fields, PMOD throws NPE (219 milliseconds)
19:19:59.018 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/hiveudftesttable specified for non-external table:hiveudftesttable
[info] - hive struct udf (161 milliseconds)
[info] - Max/Min on named_struct (957 milliseconds)
[info] - SPARK-6409 UDAF Average test (374 milliseconds)
19:20:00.507 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
[info] - SPARK-2693 udaf aggregates test (608 milliseconds)
[info] - SPARK-16228 Percentile needs explicit cast to double (12 milliseconds)
[info] - Generic UDAF aggregates (3 seconds, 370 milliseconds)
[info] - UDFIntegerToString (190 milliseconds)
[info] - UDFToListString (55 milliseconds)
[info] - UDFToListInt (52 milliseconds)
[info] - UDFToStringIntMap (78 milliseconds)
[info] - UDFToIntIntMap (101 milliseconds)
[info] - UDFListListInt (140 milliseconds)
[info] - UDFListString (115 milliseconds)
[info] - UDFStringString (160 milliseconds)
[info] - UDFTwoListList (171 milliseconds)
[info] - non-deterministic children of UDF (17 milliseconds)
[info] - non-deterministic children expressions of UDAF (21 milliseconds)
[info] - Hive UDFs with insufficient number of input arguments should trigger an analysis error (42 milliseconds)
[info] - Hive UDF in group by (193 milliseconds)
19:20:05.848 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:20:05.970 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:20:06.206 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:20:06.210 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:20:06.967 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/parquet_tmp specified for non-external table:parquet_tmp
[info] - SPARK-11522 select input_file_name from non-parquet table (1 second, 516 milliseconds)
[info] - Hive Stateful UDF (449 milliseconds)
[info] ScriptTransformationSuite:
[info] - cat without SerDe (246 milliseconds)
[info] - cat with LazySimpleSerDe (165 milliseconds)
19:20:09.454 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14593.0 (TID 35278)
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)
19:20:09.455 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: 
19:20:09.456 ERROR org.apache.spark.util.Utils: Uncaught exception in thread Thread-ScriptTransformation-Feed
java.lang.IllegalArgumentException: intentional exception
Exception in thread "Thread-ScriptTransformation-Feed" 	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)
19:20:09.457 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 14593.0 (TID 35278, localhost, executor driver): java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)

19:20:09.458 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 14593.0 failed 1 times; aborting job
[info] - script transformation should not swallow errors from upstream operators (no serde) (1 second, 196 milliseconds)
19:20:10.641 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: 
Exception in thread "Thread-ScriptTransformation-Feed" java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
19:20:10.641 ERROR org.apache.spark.util.Utils: Uncaught exception in thread Thread-ScriptTransformation-Feed
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)
19:20:10.641 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14594.0 (TID 35279)
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)
19:20:10.643 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 14594.0 (TID 35279, localhost, executor driver): java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:136)
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator$$anonfun$doExecute$1.apply(ScriptTransformationSuite.scala:133)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply$mcV$sp(ScriptTransformation.scala:289)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread$$anonfun$run$1.apply(ScriptTransformation.scala:278)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1966)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformation.scala:278)

19:20:10.643 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 14594.0 failed 1 times; aborting job
[info] - script transformation should not swallow errors from upstream operators (with serde) (1 second, 184 milliseconds)
19:20:10.744 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: /bin/bash: some_non_existent_command: command not found

19:20:10.745 ERROR org.apache.spark.sql.hive.execution.ScriptTransformation: /bin/bash: some_non_existent_command: command not found

19:20:10.746 ERROR org.apache.spark.sql.hive.execution.ScriptTransformation: /bin/bash: some_non_existent_command: command not found

19:20:10.746 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14595.0 (TID 35280)
org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.checkFailureAndPropagate(ScriptTransformation.scala:149)
	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.hasNext(ScriptTransformation.scala:197)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:829)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:829)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.checkFailureAndPropagate(ScriptTransformation.scala:149)
	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.hasNext(ScriptTransformation.scala:186)
	... 14 more
19:20:10.747 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 14595.0 (TID 35280, localhost, executor driver): org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.checkFailureAndPropagate(ScriptTransformation.scala:149)
	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.hasNext(ScriptTransformation.scala:197)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:829)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:829)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.checkFailureAndPropagate(ScriptTransformation.scala:149)
	at org.apache.spark.sql.hive.execution.ScriptTransformation$$anon$1.hasNext(ScriptTransformation.scala:186)
	... 14 more

19:20:10.747 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 14595.0 failed 1 times; aborting job
[info] - SPARK-14400 script transformation should fail for bad script command (103 milliseconds)
19:20:17.355 WARN org.apache.hadoop.hive.metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
19:20:17.527 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
[info] HiveExternalCatalogSuite:
19:20:18.148 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database testing, returning NoSuchObjectException
19:20:18.149 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database testing2, returning NoSuchObjectException
19:20:18.150 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database testing, returning NoSuchObjectException
19:20:18.169 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database testing2, returning NoSuchObjectException
19:20:18.175 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - basic create and list databases (57 milliseconds)
19:20:18.344 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:18.350 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - get database when a database exists (933 milliseconds)
19:20:20.323 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:20.339 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:20.597 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_that_does_not_exist, returning NoSuchObjectException
[info] - get database should throw exception when the database does not exist (279 milliseconds)
19:20:20.818 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:20.821 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list databases without pattern (214 milliseconds)
19:20:21.345 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:21.349 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list databases with pattern (455 milliseconds)
19:20:21.957 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:21.960 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop database (434 milliseconds)
19:20:22.516 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:22.518 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:22.991 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:22.997 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:23.364 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:23.366 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop database when the database is not empty (1 second, 283 milliseconds)
19:20:23.817 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:23.819 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:24.122 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_that_does_not_exist, returning NoSuchObjectException
19:20:24.123 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_that_does_not_exist, returning NoSuchObjectException
[info] - drop database when the database does not exist (309 milliseconds)
19:20:24.394 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:24.397 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - alter database (214 milliseconds)
19:20:24.753 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:24.758 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:25.004 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - alter database should throw exception when the database does not exist (255 milliseconds)
19:20:25.207 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:25.209 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - the table type of an external table should be EXTERNAL_TABLE (242 milliseconds)
19:20:25.692 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:25.699 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - create table when the table already exists (275 milliseconds)
19:20:26.097 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:26.100 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop table (208 milliseconds)
19:20:26.431 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:26.438 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:26.684 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database unknown_db, returning NoSuchObjectException
19:20:26.686 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database unknown_db, returning NoSuchObjectException
[info] - drop table when database/table does not exist (263 milliseconds)
19:20:26.869 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:26.871 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename table (270 milliseconds)
19:20:27.378 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:27.384 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename table when database/table does not exist (190 milliseconds)
19:20:27.700 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:27.702 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename table when destination table already exists (214 milliseconds)
19:20:28.100 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:28.103 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - alter table (331 milliseconds)
19:20:28.538 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:28.542 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - alter table when database/table does not exist (221 milliseconds)
19:20:28.916 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:28.918 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - alter table schema (234 milliseconds)
19:20:29.267 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:29.270 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - get table (186 milliseconds)
19:20:29.550 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:29.553 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - get table when database/table does not exist (183 milliseconds)
19:20:29.891 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:29.892 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:30.105 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database unknown_db, returning NoSuchObjectException
[info] - list tables without pattern (220 milliseconds)
19:20:30.241 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:30.244 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:30.457 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database unknown_db, returning NoSuchObjectException
[info] - list tables with pattern (242 milliseconds)
19:20:30.598 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:30.603 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:30.814 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-2af65c6a-ed2b-4377-8d9b-43bbc80e6aef/tbl specified for non-external table:tbl
[info] - column names should be case-preserving and column nullability should be retained (263 milliseconds)
19:20:31.089 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database mydb, returning NoSuchObjectException
[info] - basic create and list partitions (214 milliseconds)
19:20:31.393 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:31.398 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - create partitions when database/table does not exist (199 milliseconds)
19:20:31.687 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:31.688 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - create partitions that already exist (240 milliseconds)
19:20:32.049 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:32.051 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:32.213 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-323d74ab-356d-404f-969e-45c4659057c3/tbl specified for non-external table:tbl
[info] - create partitions without location (326 milliseconds)
19:20:32.616 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:32.620 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:32.808 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-37b4090e-0036-47ea-b8f4-a4ff3db7c237/tbl specified for non-external table:tbl
[info] - create/drop partitions in managed tables with location (698 milliseconds)
19:20:33.467 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:33.471 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list partition names (312 milliseconds)
19:20:33.988 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:33.989 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list partition names with partial partition spec (294 milliseconds)
19:20:34.416 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:34.427 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list partitions with partial partition spec (272 milliseconds)
19:20:34.827 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:34.832 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:35.416 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:35.420 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop partitions (1 second, 153 milliseconds)
19:20:36.161 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:36.166 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop partitions when database/table does not exist (195 milliseconds)
19:20:36.486 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:36.487 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop partitions that do not exist (197 milliseconds)
19:20:36.821 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:36.823 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - get partition (217 milliseconds)
19:20:37.130 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:37.134 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - get partition when database/table does not exist (194 milliseconds)
19:20:37.436 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:37.438 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename partitions (333 milliseconds)
19:20:37.902 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:37.903 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:38.108 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-e0c7552f-5a83-4d5a-8fd2-1f42346c3188/tbl specified for non-external table:tbl
[info] - rename partitions should update the location for managed table (627 milliseconds)
19:20:38.802 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:38.806 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename partitions when database/table does not exist (178 milliseconds)
19:20:39.066 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:39.067 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename partitions when the new partition already exists (187 milliseconds)
19:20:39.383 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:39.388 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - alter partitions (431 milliseconds)
19:20:39.921 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:39.924 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:40.102 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - alter partitions when database/table does not exist (185 milliseconds)
19:20:40.197 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database mydb, returning NoSuchObjectException
[info] - basic create and list functions (7 milliseconds)
19:20:40.239 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:40.241 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:40.378 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - create function when database does not exist (170 milliseconds)
19:20:40.507 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:40.509 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - create function that already exists (155 milliseconds)
19:20:40.743 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:40.744 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop function (136 milliseconds)
19:20:40.966 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:40.967 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:41.155 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - drop function when database does not exist (192 milliseconds)
19:20:41.261 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:41.266 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop function that does not exist (200 milliseconds)
19:20:41.582 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:41.584 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - get function (149 milliseconds)
19:20:41.860 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:41.862 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:42.021 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - get function when database does not exist (162 milliseconds)
19:20:42.097 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:42.099 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename function (188 milliseconds)
19:20:42.369 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:42.374 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:42.544 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database does_not_exist, returning NoSuchObjectException
[info] - rename function when database does not exist (178 milliseconds)
19:20:42.679 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:42.680 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - rename function when new function already exists (209 milliseconds)
19:20:43.004 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:43.012 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list functions (244 milliseconds)
19:20:43.387 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:43.391 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:43.549 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database mydb, returning NoSuchObjectException
[info] - create/drop database should create/delete the directory (210 milliseconds)
19:20:43.721 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:43.723 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:43.941 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-cbdeb5e6-e4d8-408b-985a-b3a64305c6c8/my_table specified for non-external table:my_table
[info] - create/drop/rename table should create/delete/rename the directory (545 milliseconds)
19:20:44.415 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:44.420 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:44.628 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-74454e5a-c762-462a-b272-d5e136ff21ea/tbl specified for non-external table:tbl
[info] - create/drop/rename partitions should create/delete/rename the directory (1 second, 162 milliseconds)
19:20:45.954 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:45.960 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - drop partition from external table should not delete the directory (458 milliseconds)
19:20:46.550 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:46.555 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
[info] - list partitions by filter (318 milliseconds)
19:20:47.052 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db1, returning NoSuchObjectException
19:20:47.058 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db2, returning NoSuchObjectException
19:20:47.271 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/spark-0a9d9350-b0a1-4789-9eba-a2325f7f4205/hive_tbl specified for non-external table:hive_tbl
[info] - SPARK-18647: do not put provider in table properties for Hive serde table (274 milliseconds)
[info] SQLViewSuite:
19:20:47.588 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] - create a permanent view on a permanent view (329 milliseconds)
[info] - create a temp view on a permanent view (178 milliseconds)
[info] - create a temp view on a temp view (93 milliseconds)
[info] - create a permanent view on a temp view (26 milliseconds)
19:20:48.226 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/tab1 specified for non-external table:tab1
[info] - error handling: existing a table with the duplicate name when creating/altering a view (91 milliseconds)
19:20:48.319 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/tab1 specified for non-external table:tab1
[info] - existing a table with the duplicate name when CREATE VIEW IF NOT EXISTS (167 milliseconds)
[info] - Issue exceptions for ALTER VIEW on the temporary view (87 milliseconds)
[info] - Issue exceptions for ALTER TABLE on the temporary view (64 milliseconds)
[info] - Issue exceptions for other table DDL on the temporary view (15 milliseconds)
[info] - error handling: insert/load/truncate table commands against a view (67 milliseconds)
[info] - error handling: fail if the view sql itself is invalid (7 milliseconds)
[info] - error handling: fail if the temp view name contains the database prefix (1 millisecond)
[info] - error handling: disallow IF NOT EXISTS for CREATE TEMPORARY VIEW (1 millisecond)
[info] - error handling: fail if the temp view sql itself is invalid (6 milliseconds)
[info] - correctly parse CREATE VIEW statement (375 milliseconds)
[info] - correctly parse CREATE TEMPORARY VIEW statement (292 milliseconds)
[info] - should NOT allow CREATE TEMPORARY VIEW when TEMPORARY VIEW with same name exists (14 milliseconds)
[info] - should allow CREATE TEMPORARY VIEW when a permanent VIEW with same name exists (55 milliseconds)
[info] - should allow CREATE permanent VIEW when a TEMPORARY VIEW with same name exists (58 milliseconds)
19:20:49.722 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] - correctly handle CREATE VIEW IF NOT EXISTS (574 milliseconds)
[info] - correctly handle CREATE OR REPLACE TEMPORARY VIEW (578 milliseconds)
19:20:51.267 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] - correctly handle CREATE OR REPLACE VIEW (1 second, 34 milliseconds)
19:20:51.920 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] - correctly handle ALTER VIEW (620 milliseconds)
[info] - should not allow ALTER VIEW AS when the view does not exist (4 milliseconds)
[info] - ALTER VIEW AS should try to alter temp view first if view name has no database part (163 milliseconds)
[info] - ALTER VIEW AS should alter permanent view if view name has database part (141 milliseconds)
[info] - ALTER VIEW AS should keep the previous table properties, comment, create_time, etc. (115 milliseconds)
[info] - create hive view for json table (364 milliseconds)
[info] - create hive view for partitioned parquet table (625 milliseconds)
[info] - CTE within view (107 milliseconds)
19:20:53.861 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/src specified for non-external table:src
19:20:54.005 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_917b2caf_7833_478c_9d8e_295eff41f93d, returning NoSuchObjectException
19:20:54.220 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/db_917b2caf_7833_478c_9d8e_295eff41f93d.db/src specified for non-external table:src
[info] - Using view after switching current database (628 milliseconds)
[info] - Using view after adding more columns (632 milliseconds)
19:20:55.233 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
19:20:55.348 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
19:20:56.026 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`jt1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] - create hive view for joined tables (1 second, 506 milliseconds)
19:20:56.629 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/t_part specified for non-external table:t_part
[info] - SPARK-14933 - create view from hive parquet table (436 milliseconds)
19:20:57.064 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/t_orc specified for non-external table:t_orc
[info] - SPARK-14933 - create view from hive orc table (1 second, 100 milliseconds)
[info] - create a permanent/temp view using a hive, built-in, and permanent user function (854 milliseconds)
19:20:59.322 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/tab1 was not found. Was it deleted very recently?
[info] - create a permanent/temp view using a temporary function (313 milliseconds)
[info] HiveSchemaInferenceSuite:
[info] - orc: schema should be inferred and saved when INFER_AND_SAVE is specified (1 second, 167 milliseconds)
[info] - parquet: schema should be inferred and saved when INFER_AND_SAVE is specified (1 second, 224 milliseconds)
[info] - orc: schema should be inferred but not stored when INFER_ONLY is specified (1 second, 26 milliseconds)
[info] - parquet: schema should be inferred but not stored when INFER_ONLY is specified (1 second, 669 milliseconds)
[info] - orc: schema should not be inferred when NEVER_INFER is specified (865 milliseconds)
[info] - parquet: schema should not be inferred when NEVER_INFER is specified (884 milliseconds)
[info] - mergeWithMetastoreSchema() should return expected results (4 milliseconds)
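The three outcomes exercised above correspond to Spark's `spark.sql.hive.caseSensitiveInferenceMode` setting, whose values are `INFER_AND_SAVE`, `INFER_ONLY`, and `NEVER_INFER`. As a rough sketch of the decision each mode makes (illustrative only — `resolve_schema` is a hypothetical helper modeling the behavior the suite checks, not Spark's actual API):

```python
def resolve_schema(mode, metastore_schema, inferred_schema):
    """Pick the schema to use and whether to persist it back to the metastore.

    Mirrors the three test cases above: INFER_AND_SAVE infers from the files
    and saves the result, INFER_ONLY infers without storing, and NEVER_INFER
    trusts the (case-insensitive) metastore schema as-is.
    """
    if mode == "NEVER_INFER":
        return metastore_schema, False   # no inference at all
    if mode == "INFER_ONLY":
        return inferred_schema, False    # infer, but do not store
    if mode == "INFER_AND_SAVE":
        return inferred_schema, True     # infer and write back
    raise ValueError("unknown inference mode: %s" % mode)
```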
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.testUDAF started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.saveTableAndQueryIt started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 1.497s
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveExternalTableAndQueryIt started
19:21:07.953 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/datasource-87e27ec8-9c01-4e09-a1ea-7acc92722895 was not found. Was it deleted very recently?
19:21:08.064 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
19:21:08.304 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`externaltable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveTableAndQueryIt started
19:21:08.594 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/warehouse-52df4ab3-672a-4315-95dc-535ab7d3cd23/javasavedtable was not found. Was it deleted very recently?
19:21:08.719 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveExternalTableWithSchemaAndQueryIt started
19:21:08.849 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/target/tmp/datasource-205e2872-eec6-41a8-9d31-f71e194b08ea was not found. Was it deleted very recently?
19:21:08.960 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
19:21:09.120 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`externaltable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test run finished: 0 failed, 0 ignored, 3 total, 1.516s
[info] ScalaCheck
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] Warning: Unknown ScalaCheck args provided: -oDF
[info] ScalaTest
[info] Run completed in 2 hours, 7 minutes, 33 seconds.
[info] Total number of tests run: 2311
[info] Suites: completed 72, aborted 0
[info] Tests: succeeded 2311, failed 0, canceled 0, ignored 593, pending 0
[info] All tests passed.
[info] Passed: Total 2316, Failed 0, Errors 0, Passed 2316, Ignored 593
[success] Total time: 8577 s, completed Aug 27, 2018 7:21:17 PM

========================================================================
Running PySpark tests
========================================================================
Running PySpark tests. Output is in /home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/unit-tests.log
Will test against the following Python executables: ['python2.6', 'python3.4', 'pypy']
Will test the following Python modules: ['pyspark-core', 'pyspark-ml', 'pyspark-mllib', 'pyspark-sql', 'pyspark-streaming']
Starting test(pypy): pyspark.streaming.tests
Starting test(pypy): pyspark.tests
Starting test(python2.6): pyspark.mllib.tests
Starting test(pypy): pyspark.sql.tests
Finished test(pypy): pyspark.tests (213s)
Starting test(python2.6): pyspark.sql.tests
Finished test(python2.6): pyspark.mllib.tests (243s)
Starting test(python2.6): pyspark.streaming.tests
Running tests: [<class 'pyspark.streaming.tests.BasicOperationTests'>, <class 'pyspark.streaming.tests.WindowFunctionTests'>, <class 'pyspark.streaming.tests.StreamingContextTests'>, <class 'pyspark.streaming.tests.CheckpointTests'>, <class 'pyspark.streaming.tests.KafkaStreamTests'>, <class 'pyspark.streaming.tests.FlumeStreamTests'>, <class 'pyspark.streaming.tests.FlumePollingStreamTests'>, <class 'pyspark.streaming.tests.StreamingListenerTests'>, <class 'pyspark.streaming.tests.KinesisStreamTests'>] 
[Running <class 'pyspark.streaming.tests.BasicOperationTests'>]
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
test_cogroup (pyspark.streaming.tests.BasicOperationTests) ... ok
test_combineByKey (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.combineByKey. ... ok
test_count (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.count. ... ok
test_countByValue (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.countByValue. ... ok
test_failed_func (pyspark.streaming.tests.BasicOperationTests) ... ok
test_failed_func2 (pyspark.streaming.tests.BasicOperationTests) ... ok
test_failed_func_with_reseting_failure (pyspark.streaming.tests.BasicOperationTests) ... ok
test_filter (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.filter. ... ok
test_flatMap (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.flatMap. ... ok
test_flatMapValues (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.flatMapValues. ... ok
test_full_outer_join (pyspark.streaming.tests.BasicOperationTests) ... ok
test_glom (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.glom. ... ok
test_groupByKey (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.groupByKey. ... ok
test_join (pyspark.streaming.tests.BasicOperationTests) ... ok
test_left_outer_join (pyspark.streaming.tests.BasicOperationTests) ... ok
test_map (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.map. ... ok
test_mapPartitions (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.mapPartitions. ... ok
test_mapValues (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.mapValues. ... ok
test_reduce (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.reduce. ... ok
test_reduceByKey (pyspark.streaming.tests.BasicOperationTests)
Basic operation test for DStream.reduceByKey. ... ok
test_repartition (pyspark.streaming.tests.BasicOperationTests) ... ok
test_right_outer_join (pyspark.streaming.tests.BasicOperationTests) ... ok
test_union (pyspark.streaming.tests.BasicOperationTests) ... ok
test_update_state_by_key (pyspark.streaming.tests.BasicOperationTests) ... ok
test_update_state_by_key_initial_rdd (pyspark.streaming.tests.BasicOperationTests) ... ok

----------------------------------------------------------------------
Ran 25 tests in 48.188s

OK
[Running <class 'pyspark.streaming.tests.WindowFunctionTests'>]
test_count_by_value_and_window (pyspark.streaming.tests.WindowFunctionTests) ... ok
test_count_by_window (pyspark.streaming.tests.WindowFunctionTests) ... ok
test_count_by_window_large (pyspark.streaming.tests.WindowFunctionTests) ... FAIL
test_group_by_key_and_window (pyspark.streaming.tests.WindowFunctionTests) ... ok
test_reduce_by_invalid_window (pyspark.streaming.tests.WindowFunctionTests) ... ok
test_reduce_by_key_and_window_with_none_invFunc (pyspark.streaming.tests.WindowFunctionTests) ... ok
test_window (pyspark.streaming.tests.WindowFunctionTests) ... ok

======================================================================
FAIL: test_count_by_window_large (pyspark.streaming.tests.WindowFunctionTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/pyspark/streaming/tests.py", line 650, in test_count_by_window_large
    self._test_func(input, func, expected)
  File "/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/pyspark/streaming/tests.py", line 162, in _test_func
    self.assertEqual(expected, result)
AssertionError: Lists differ: [[1], [3], [6], [10], [15], [2... != [[1], [3], [6], [10], [15], [2...

First list contains 1 additional elements.
First extra element 9:
[6]

- [[1], [3], [6], [10], [15], [20], [18], [15], [11], [6]]
?                                                   ---- -

+ [[1], [3], [6], [10], [15], [20], [18], [15], [11]]

----------------------------------------------------------------------
Ran 7 tests in 43.995s

FAILED (failures=1)
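The diff in the failure above reads naturally as a sliding-window count. If the six batches carry 1..6 elements and `countByWindow` uses a 5-batch window sliding by 1 batch (a guess at the test's parameters; the real values live in `python/pyspark/streaming/tests.py`), the expected sequence falls out of plain arithmetic, and the missing trailing `[6]` suggests the run stopped one batch before the window fully drained. A pure-Python sketch of that arithmetic:

```python
# Pure-Python model of DStream.countByWindow (window = 5 batches, slide = 1).
# Hypothetical batch sizes -- chosen only because they reproduce the diff above.
batch_counts = [1, 2, 3, 4, 5, 6]      # elements arriving in each batch
padded = batch_counts + [0] * 4        # empty batches while the window drains
window = 5

# At the end of each batch interval, count everything in the last `window` batches.
results = [[sum(padded[max(0, i - window + 1): i + 1])]
           for i in range(len(padded))]
print(results)
# -> [[1], [3], [6], [10], [15], [20], [18], [15], [11], [6]]
```

The observed result matches this list minus the final `[6]`, i.e. the last window never fired before the test's timeout.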
[Running <class 'pyspark.streaming.tests.StreamingContextTests'>]
test_await_termination_or_timeout (pyspark.streaming.tests.StreamingContextTests) ... ok
test_binary_records_stream (pyspark.streaming.tests.StreamingContextTests) ... ok
test_get_active (pyspark.streaming.tests.StreamingContextTests) ... ok
test_get_active_or_create (pyspark.streaming.tests.StreamingContextTests) ... ok
test_queue_stream (pyspark.streaming.tests.StreamingContextTests) ... ok
test_stop_multiple_times (pyspark.streaming.tests.StreamingContextTests) ... ok
test_stop_only_streaming_context (pyspark.streaming.tests.StreamingContextTests) ... ok
test_text_file_stream (pyspark.streaming.tests.StreamingContextTests) ... ok
test_transform (pyspark.streaming.tests.StreamingContextTests) ... ok
test_union (pyspark.streaming.tests.StreamingContextTests) ... ok

----------------------------------------------------------------------
Ran 10 tests in 11.197s

OK
[Running <class 'pyspark.streaming.tests.CheckpointTests'>]
test_get_or_create_and_get_active_or_create (pyspark.streaming.tests.CheckpointTests) ... ok
test_transform_function_serializer_failure (pyspark.streaming.tests.CheckpointTests) ... Traceback (most recent call last):
  File "/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/pyspark/cloudpickle.py", line 147, in dump
    return Pickler.dump(self, obj)
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 548, in save_tuple
    save(element)
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/pyspark/cloudpickle.py", line 254, in save_function
    self.save_function_tuple(obj)
  File "/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/pyspark/cloudpickle.py", line 291, in save_function_tuple
    save((code, closure, base_globals))
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 548, in save_tuple
    save(element)
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 600, in save_list
    self._batch_appends(iter(obj))
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 636, in _batch_appends
    save(tmp[0])
  File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 306, in save
    rv = reduce(self.proto)
  File "/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/pyspark/context.py", line 283, in __getnewargs__
    "It appears that you are attempting to reference SparkContext from a broadcast "
Exception: It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063.
ok

----------------------------------------------------------------------
Ran 2 tests in 42.863s

OK
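The traceback above is the expected outcome of `test_transform_function_serializer_failure` (hence the trailing `ok`): per the trace, `SparkContext.__getnewargs__` in `context.py` deliberately raises when cloudpickle tries to serialize a closure that captured the driver-side context. A minimal stand-in using only the stdlib pickler — `DriverOnly` is hypothetical, not the real class, but the `__getnewargs__` hook is the same mechanism:

```python
import pickle

class DriverOnly(object):
    """Illustrative model of an object that refuses to be serialized,
    the way pyspark's SparkContext.__getnewargs__ does (SPARK-5063)."""
    def __getnewargs__(self):
        raise Exception(
            "It appears that you are attempting to reference SparkContext "
            "from a broadcast variable, action, or transformation.")

sc = DriverOnly()
err = None
try:
    # pickle consults __getnewargs__ for protocol >= 2, so this raises.
    pickle.dumps(sc)
except Exception as exc:
    err = exc
print("serializer failure:", err)
```

This is why the test passes: the failure to pickle is the behavior under test.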
[Running <class 'pyspark.streaming.tests.KafkaStreamTests'>]
test_kafka_direct_stream (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka stream API. ... ok
test_kafka_direct_stream_foreach_get_offsetRanges (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka stream foreachRDD get offsetRanges. ... ok
test_kafka_direct_stream_from_offset (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka stream API with start offset specified. ... ok
test_kafka_direct_stream_message_handler (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka stream MessageHandler. ... ok
test_kafka_direct_stream_transform_get_offsetRanges (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka stream transform get offsetRanges. ... ok
test_kafka_direct_stream_transform_with_checkpoint (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka stream transform with checkpoint correctly recovered. ... ok
test_kafka_rdd (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka RDD API. ... ok
test_kafka_rdd_get_offsetRanges (pyspark.streaming.tests.KafkaStreamTests)
Test Python direct Kafka RDD get OffsetRanges. ... ok
test_kafka_rdd_message_handler (pyspark.streaming.tests.KafkaStreamTests)
Test Python direct Kafka RDD MessageHandler. ... ok
test_kafka_rdd_with_leaders (pyspark.streaming.tests.KafkaStreamTests)
Test the Python direct Kafka RDD API with leaders. ... ok
test_kafka_stream (pyspark.streaming.tests.KafkaStreamTests)
Test the Python Kafka stream API. ... ok
test_topic_and_partition_equality (pyspark.streaming.tests.KafkaStreamTests) ... ok

----------------------------------------------------------------------
Ran 12 tests in 118.366s

OK
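For `test_topic_and_partition_equality` above, equality is by topic name and partition number. An illustrative pure-Python stand-in (the real class is `pyspark.streaming.kafka.TopicAndPartition`; this sketch only mirrors its value semantics, not its JVM-backed implementation):

```python
class TopicAndPartition(object):
    """Value object: equal iff topic and partition both match."""
    def __init__(self, topic, partition):
        self._topic = topic
        self._partition = partition

    def __eq__(self, other):
        return (isinstance(other, TopicAndPartition)
                and self._topic == other._topic
                and self._partition == other._partition)

    def __ne__(self, other):
        return not self == other

    def __hash__(self):
        return hash((self._topic, self._partition))

a = TopicAndPartition("topic", 0)
b = TopicAndPartition("topic", 0)
c = TopicAndPartition("topic", 1)
print(a == b, a == c)
```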
[Running <class 'pyspark.streaming.tests.FlumeStreamTests'>]
test_compressed_flume_stream (pyspark.streaming.tests.FlumeStreamTests) ... ok
test_flume_stream (pyspark.streaming.tests.FlumeStreamTests) ... ok

----------------------------------------------------------------------
Ran 2 tests in 5.954s

OK
[Running <class 'pyspark.streaming.tests.FlumePollingStreamTests'>]
test_flume_polling (pyspark.streaming.tests.FlumePollingStreamTests) ... ok
test_flume_polling_multiple_hosts (pyspark.streaming.tests.FlumePollingStreamTests) ... ok

----------------------------------------------------------------------
Ran 2 tests in 33.113s

OK
[Running <class 'pyspark.streaming.tests.StreamingListenerTests'>]
test_batch_info_reports (pyspark.streaming.tests.StreamingListenerTests) ... ok

----------------------------------------------------------------------
Ran 1 test in 3.257s

OK
[Running <class 'pyspark.streaming.tests.KinesisStreamTests'>]
test_kinesis_stream (pyspark.streaming.tests.KinesisStreamTests) ... Skipped test_kinesis_stream (enable by setting environment variable ENABLE_KINESIS_TESTS=1) ... ok
test_kinesis_stream_api (pyspark.streaming.tests.KinesisStreamTests) ... ok

----------------------------------------------------------------------
Ran 2 tests in 0.592s

OK
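The Kinesis stream test above was skipped, as its log line notes. Re-running with the flag enabled would look roughly like this (the env var name comes from the skip message itself; the path assumes the workspace root shown elsewhere in this log):

```shell
# Enable the otherwise-skipped Kinesis streaming tests before invoking the runner
export ENABLE_KINESIS_TESTS=1
./python/run-tests --parallelism=4
```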
('timeout after', 15)
('timeout after', 20)
('timeout after', 20)
('timeout after', 20)
-------------------------------------------
Time: 2018-08-27 19:25:06
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:08.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:09
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:09.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:10
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:10.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:11
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:11.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:11.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:12
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:12.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:13
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:13.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:14
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:14.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:15
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:15.500000
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:16
-------------------------------------------

-------------------------------------------
Time: 2018-08-27 19:25:16.500000
-------------------------------------------

('timeout after', 20)

Had test failures in pyspark.streaming.tests with pypy; see logs.
[error] running /home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.7/python/run-tests --parallelism=4 ; received return code 255
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Finished: FAILURE