Failed
Console Output

Skipping 6,663 KB..
[info] - saveAsTable()/load() - non-partitioned table - Append (819 milliseconds)
[info] - saveAsTable()/load() - non-partitioned table - ErrorIfExists (147 milliseconds)
[info] - saveAsTable()/load() - non-partitioned table - Ignore (187 milliseconds)
[info] - saveAsTable()/load() - partitioned table - simple queries (1 second, 512 milliseconds)
[info] - saveAsTable()/load() - partitioned table - boolean type (968 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Overwrite (1 second, 470 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Append (1 second, 529 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Append - new partition values (1 second, 45 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Append - mismatched partition columns (394 milliseconds)
[info] - saveAsTable()/load() - partitioned table - ErrorIfExists (78 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Ignore (21 milliseconds)
[info] - load() - with directory of unpartitioned data in nested subdirs (671 milliseconds)
[info] - Hadoop style globbing - unpartitioned data (1 second, 269 milliseconds)
[info] - Hadoop style globbing - partitioned data with schema inference (1 second, 849 milliseconds)
[info] - SPARK-9735 Partition column type casting (1 second, 408 milliseconds)
[info] - SPARK-7616: adjust column name order accordingly when saving partitioned table (1 second, 53 milliseconds)
[info] - SPARK-8887: Explicitly define which data types can be used as dynamic partition columns (159 milliseconds)
[info] - Locality support for FileScanRDD (316 milliseconds)
[info] - SPARK-16975: Partitioned table with the column having '_' should be read correctly (1 second, 135 milliseconds)
06:49:14.739 WARN org.apache.spark.sql.execution.datasources.DataSource: Found duplicate column(s) in the data schema and the partition schema: `p1`;
[info] - save()/load() - partitioned table - simple queries - partition columns in data (2 seconds, 18 milliseconds)
[info] - SPARK-12218: 'Not' is included in ORC filter pushdown (569 milliseconds)
[info] - SPARK-13543: Support for specifying compression codec for ORC via option() (460 milliseconds)
[info] - Default compression codec is snappy for ORC compression (298 milliseconds)
[info] HiveOrcHadoopFsRelationSuite:
[info] - test all data types (19 seconds, 379 milliseconds)
[info] - save()/load() - non-partitioned table - Overwrite (700 milliseconds)
[info] - save()/load() - non-partitioned table - Append (973 milliseconds)
[info] - save()/load() - non-partitioned table - ErrorIfExists (55 milliseconds)
[info] - save()/load() - non-partitioned table - Ignore (45 milliseconds)
[info] - save()/load() - partitioned table - simple queries (1 second, 984 milliseconds)
[info] - save()/load() - partitioned table - Overwrite (1 second, 190 milliseconds)
[info] - save()/load() - partitioned table - Append (1 second, 269 milliseconds)
[info] - save()/load() - partitioned table - Append - new partition values (840 milliseconds)
[info] - save()/load() - partitioned table - ErrorIfExists (58 milliseconds)
[info] - save()/load() - partitioned table - Ignore (58 milliseconds)
[info] - saveAsTable()/load() - non-partitioned table - Overwrite (448 milliseconds)
[info] - saveAsTable()/load() - non-partitioned table - Append (797 milliseconds)
[info] - saveAsTable()/load() - non-partitioned table - ErrorIfExists (156 milliseconds)
[info] - saveAsTable()/load() - non-partitioned table - Ignore (176 milliseconds)
[info] - saveAsTable()/load() - partitioned table - simple queries (1 second, 559 milliseconds)
[info] - saveAsTable()/load() - partitioned table - boolean type (1 second, 744 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Overwrite (1 second, 399 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Append (1 second, 441 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Append - new partition values (976 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Append - mismatched partition columns (381 milliseconds)
[info] - saveAsTable()/load() - partitioned table - ErrorIfExists (75 milliseconds)
[info] - saveAsTable()/load() - partitioned table - Ignore (20 milliseconds)
[info] - load() - with directory of unpartitioned data in nested subdirs (581 milliseconds)
[info] - Hadoop style globbing - unpartitioned data (1 second, 272 milliseconds)
[info] - Hadoop style globbing - partitioned data with schema inference (1 second, 824 milliseconds)
[info] - SPARK-9735 Partition column type casting (1 second, 244 milliseconds)
[info] - SPARK-7616: adjust column name order accordingly when saving partitioned table (922 milliseconds)
[info] - SPARK-8887: Explicitly define which data types can be used as dynamic partition columns (102 milliseconds)
[info] - Locality support for FileScanRDD (301 milliseconds)
[info] - SPARK-16975: Partitioned table with the column having '_' should be read correctly (1 second, 124 milliseconds)
06:50:01.317 WARN org.apache.spark.sql.execution.datasources.DataSource: Found duplicate column(s) in the data schema and the partition schema: `p1`;
[info] - save()/load() - partitioned table - simple queries - partition columns in data (2 seconds, 74 milliseconds)
[info] - SPARK-12218: 'Not' is included in ORC filter pushdown (539 milliseconds)
[info] - SPARK-13543: Support for specifying compression codec for ORC via option() (377 milliseconds)
[info] - Default compression codec is snappy for ORC compression (261 milliseconds)
[info] HiveSessionStateSuite:
06:50:03.752 WARN org.apache.spark.sql.SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
[info] - fork new session and inherit RuntimeConfig options (3 milliseconds)
[info] - fork new session and inherit function registry and udf (8 milliseconds)
[info] - fork new session and inherit experimental methods (4 milliseconds)
[info] - fork new session and inherit listener manager (188 milliseconds)
Build timed out (after 300 minutes). Marking the build as failed.
Build was aborted
Archiving artifacts
[info] - lateral_view *** FAILED *** (4 hours, 10 minutes, 56 seconds)
[info]   java.lang.NullPointerException:
[info]   at org.apache.spark.sql.internal.SQLConf$.$anonfun$get$1(SQLConf.scala:134)
[info]   at scala.Option.map(Option.scala:163)
[info]   at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:134)
[info]   at org.apache.spark.sql.catalyst.util.StringUtils$PlanStringConcat.<init>(StringUtils.scala:141)
[info]   at org.apache.spark.sql.execution.QueryExecution.toString(QueryExecution.scala:157)
[info]   at java.lang.String.valueOf(String.java:2994)
[info]   at java.lang.StringBuilder.append(StringBuilder.java:131)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$31(HiveComparisonTest.scala:360)
[info]   at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at scala.collection.TraversableLike.map(TraversableLike.scala:237)
[info]   at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
[info]   at scala.collection.AbstractTraversable.map(Traversable.scala:108)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.doTest$1(HiveComparisonTest.scala:346)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$10(HiveComparisonTest.scala:464)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
[info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:149)
[info]   at org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
[info]   at org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
[info]   at org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
[info]   at org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:56)
[info]   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:221)
[info]   at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:214)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.org$scalatest$BeforeAndAfter$$super$runTest(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:203)
[info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:192)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.runTest(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.FunSuiteLike.$anonfun$runTests$1(FunSuiteLike.scala:229)
[info]   at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:396)
[info]   at scala.collection.immutable.List.foreach(List.scala:392)
[info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
[info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:379)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
[info]   at org.scalatest.FunSuiteLike.runTests(FunSuiteLike.scala:229)
[info]   at org.scalatest.FunSuiteLike.runTests$(FunSuiteLike.scala:228)
[info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
[info]   at org.scalatest.Suite.run(Suite.scala:1147)
[info]   at org.scalatest.Suite.run$(Suite.scala:1129)
[info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
[info]   at org.scalatest.FunSuiteLike.$anonfun$run$1(FunSuiteLike.scala:233)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
[info]   at org.scalatest.FunSuiteLike.run(FunSuiteLike.scala:233)
[info]   at org.scalatest.FunSuiteLike.run$(FunSuiteLike.scala:232)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:56)
[info]   at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.org$scalatest$BeforeAndAfter$$super$run(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:258)
[info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:256)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.run(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info] - lateral_view_noalias *** FAILED *** (2 milliseconds)
[info]   java.lang.NullPointerException:
[info]   at org.apache.spark.sql.internal.SQLConf$.$anonfun$get$1(SQLConf.scala:134)
[info]   at scala.Option.map(Option.scala:163)
[info]   at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:134)
[info]   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:94)
[info]   at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
[info]   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:76)
[info]   at org.apache.spark.sql.hive.test.TestHiveQueryExecution.<init>(TestHive.scala:582)
[info]   at org.apache.spark.sql.hive.test.TestHiveQueryExecution.<init>(TestHive.scala:586)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$31(HiveComparisonTest.scala:347)
[info]   at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at scala.collection.TraversableLike.map(TraversableLike.scala:237)
[info]   at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
[info]   at scala.collection.AbstractTraversable.map(Traversable.scala:108)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.doTest$1(HiveComparisonTest.scala:346)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$10(HiveComparisonTest.scala:462)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
[info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:149)
[info]   at org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
[info]   at org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
[info]   at org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
[info]   at org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:56)
[info]   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:221)
[info]   at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:214)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.org$scalatest$BeforeAndAfter$$super$runTest(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:203)
[info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:192)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.runTest(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.FunSuiteLike.$anonfun$runTests$1(FunSuiteLike.scala:229)
[info]   at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:396)
[info]   at scala.collection.immutable.List.foreach(List.scala:392)
[info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
[info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:379)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
[info]   at org.scalatest.FunSuiteLike.runTests(FunSuiteLike.scala:229)
[info]   at org.scalatest.FunSuiteLike.runTests$(FunSuiteLike.scala:228)
[info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
[info]   at org.scalatest.Suite.run(Suite.scala:1147)
[info]   at org.scalatest.Suite.run$(Suite.scala:1129)
[info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
[info]   at org.scalatest.FunSuiteLike.$anonfun$run$1(FunSuiteLike.scala:233)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
[info]   at org.scalatest.FunSuiteLike.run(FunSuiteLike.scala:233)
[info]   at org.scalatest.FunSuiteLike.run$(FunSuiteLike.scala:232)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:56)
[info]   at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.org$scalatest$BeforeAndAfter$$super$run(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:258)
[info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:256)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.run(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info] - lateral_view_ppd *** FAILED *** (0 milliseconds)
[info]   java.lang.NullPointerException:
[info]   at org.apache.spark.sql.internal.SQLConf$.$anonfun$get$1(SQLConf.scala:134)
[info]   at scala.Option.map(Option.scala:163)
[info]   at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:134)
[info]   at org.apache.spark.sql.catalyst.util.StringUtils$PlanStringConcat.<init>(StringUtils.scala:141)
[info]   at org.apache.spark.sql.execution.QueryExecution.toString(QueryExecution.scala:157)
[info]   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$4(SQLExecution.scala:95)
[info]   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
[info]   at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:87)
[info]   at org.apache.spark.sql.hive.test.TestHiveSparkSession.loadTestTable(TestHive.scala:505)
[info]   at org.apache.spark.sql.hive.test.TestHiveContext.loadTestTable(TestHive.scala:153)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$23(HiveComparisonTest.scala:312)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$23$adapted(HiveComparisonTest.scala:304)
[info]   at scala.collection.immutable.List.foreach(List.scala:392)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.doTest$1(HiveComparisonTest.scala:304)
[info]   at org.apache.spark.sql.hive.execution.HiveComparisonTest.$anonfun$createQueryTest$10(HiveComparisonTest.scala:462)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
[info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:149)
[info]   at org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
[info]   at org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
[info]   at org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
[info]   at org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:56)
[info]   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:221)
[info]   at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:214)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.org$scalatest$BeforeAndAfter$$super$runTest(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:203)
[info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:192)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.runTest(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.FunSuiteLike.$anonfun$runTests$1(FunSuiteLike.scala:229)
[info]   at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:396)
[info]   at scala.collection.immutable.List.foreach(List.scala:392)
[info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
[info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:379)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
[info]   at org.scalatest.FunSuiteLike.runTests(FunSuiteLike.scala:229)
[info]   at org.scalatest.FunSuiteLike.runTests$(FunSuiteLike.scala:228)
[info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
[info]   at org.scalatest.Suite.run(Suite.scala:1147)
[info]   at org.scalatest.Suite.run$(Suite.scala:1129)
[info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
[info]   at org.scalatest.FunSuiteLike.$anonfun$run$1(FunSuiteLike.scala:233)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
[info]   at org.scalatest.FunSuiteLike.run(FunSuiteLike.scala:233)
[info]   at org.scalatest.FunSuiteLike.run$(FunSuiteLike.scala:232)
[info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:56)
[info]   at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.org$scalatest$BeforeAndAfter$$super$run(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:258)
[info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:256)
[info]   at org.apache.spark.sql.hive.execution.HiveCompatibilitySuite.run(HiveCompatibilitySuite.scala:33)
[info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info] - lb_fs_stats !!! IGNORED !!!
[info] - leadlag !!! IGNORED !!!
[info] - leadlag_queries !!! IGNORED !!!
Exception in thread "block-manager-slave-async-thread-pool-203" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:502)
	at org.apache.spark.storage.BlockInfoManager.lockForWriting(BlockInfoManager.scala:236)
	at org.apache.spark.storage.BlockManager.removeBlock(BlockManager.scala:1713)
	at org.apache.spark.storage.BlockManager.$anonfun$removeRdd$4(BlockManager.scala:1692)
	at org.apache.spark.storage.BlockManager.$anonfun$removeRdd$4$adapted(BlockManager.scala:1692)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.storage.BlockManager.removeRdd(BlockManager.scala:1692)
	at org.apache.spark.storage.BlockManagerSlaveEndpoint$$anonfun$receiveAndReply$1.$anonfun$applyOrElse$2(BlockManagerSlaveEndpoint.scala:53)
	at scala.runtime.java8.JFunction0$mcI$sp.apply(JFunction0$mcI$sp.java:23)
	at org.apache.spark.storage.BlockManagerSlaveEndpoint.$anonfun$doAsync$1(BlockManagerSlaveEndpoint.scala:86)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:658)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
Exception in thread "Thread-71" java.io.EOFException
	at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2960)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1540)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
	at org.scalatest.tools.Framework$ScalaTestRunner$Skeleton$1$React.react(Framework.scala:818)
	at org.scalatest.tools.Framework$ScalaTestRunner$Skeleton$1.run(Framework.scala:807)
	at java.lang.Thread.run(Thread.java:748)
ERROR: Failed to archive artifacts: **/target/unit-tests.log,python/unit-tests.log
java.io.IOException: java.io.IOException: Failed to extract /home/jenkins/workspace/NewSparkPullRequestBuilder/transfer of 25 files
	at hudson.FilePath.readFromTar(FilePath.java:2300)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:2209)
	at jenkins.model.StandardArtifactManager.archive(StandardArtifactManager.java:61)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:236)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:78)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:782)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:723)
	at hudson.model.Build$BuildExecution.post2(Build.java:185)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:668)
	at hudson.model.Run.execute(Run.java:1763)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:98)
	at hudson.model.Executor.run(Executor.java:410)
Caused by: java.io.EOFException: Unexpected end of ZLIB input stream
	at com.jcraft.jzlib.InflaterInputStream.fill(InflaterInputStream.java:186)
	at com.jcraft.jzlib.InflaterInputStream.read(InflaterInputStream.java:106)
	at org.apache.commons.compress.archivers.tar.TarArchiveInputStream.read(TarArchiveInputStream.java:614)
	at java.io.InputStream.read(InputStream.java:101)
	at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1792)
	at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1769)
	at org.apache.commons.io.IOUtils.copy(IOUtils.java:1744)
	at hudson.util.IOUtils.copy(IOUtils.java:40)
	at hudson.FilePath.readFromTar(FilePath.java:2290)
	... 13 more

	at hudson.FilePath.copyRecursiveTo(FilePath.java:2216)
	at jenkins.model.StandardArtifactManager.archive(StandardArtifactManager.java:61)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:236)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:78)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:782)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:723)
	at hudson.model.Build$BuildExecution.post2(Build.java:185)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:668)
	at hudson.model.Run.execute(Run.java:1763)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:98)
	at hudson.model.Executor.run(Executor.java:410)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: This archives contains unclosed entries.
	at hudson.remoting.Channel$2.adapt(Channel.java:813)
	at hudson.remoting.Channel$2.adapt(Channel.java:808)
	at hudson.remoting.FutureAdapter.get(FutureAdapter.java:59)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:2212)
	... 12 more
Caused by: java.io.IOException: This archives contains unclosed entries.
	at org.apache.commons.compress.archivers.tar.TarArchiveOutputStream.finish(TarArchiveOutputStream.java:225)
	at org.apache.commons.compress.archivers.tar.TarArchiveOutputStream.close(TarArchiveOutputStream.java:241)
	at hudson.util.io.TarArchiver.close(TarArchiver.java:111)
	at hudson.FilePath.writeToTar(FilePath.java:2263)
	at hudson.FilePath.access$2100(FilePath.java:190)
	at hudson.FilePath$45.invoke(FilePath.java:2202)
	at hudson.FilePath$45.invoke(FilePath.java:2198)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2719)
	at hudson.remoting.UserRequest.perform(UserRequest.java:120)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:326)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
	at ......remote call to amp-jenkins-worker-05(Native Method)
	at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1416)
	at hudson.remoting.UserResponse.retrieve(UserRequest.java:220)
	at hudson.remoting.Channel$2.adapt(Channel.java:811)
	... 15 more
Recording test results
Finished: FAILURE