Regression

org.apache.spark.sql.hive.HiveSparkSubmitSuite.dir

Failing for the past 1 build (Since Failed #5037)
Took 5 min 9 sec.

Error Message

org.scalatest.exceptions.TestFailedException: Timeout of:

  ./bin/spark-submit \
    --class org.apache.spark.sql.hive.SetWarehouseLocationTest \
    --name SetHiveWarehouseLocationTest \
    --master 'local-cluster[2,1,1024]' \
    --conf spark.ui.enabled=false \
    --conf spark.master.rest.enabled=false \
    --conf spark.sql.test.expectedWarehouseDir=/home/jenkins/workspace/NewSparkPullRequestBuilder@5/target/tmp/spark-933ecf4f-2079-4057-93b6-86c95e1baf1d \
    --conf spark.driver.extraClassPath=/home/jenkins/workspace/NewSparkPullRequestBuilder@5/target/tmp/spark-63b84922-0b96-4ba0-8835-5bbe790e2f73 \
    --driver-java-options '-Dderby.system.durability=test' \
    'file:/home/jenkins/workspace/NewSparkPullRequestBuilder@5/target/tmp/spark-7a3c4be6-c71b-467d-b9e9-bbaa8e51d601/testJar-1589563434560.jar'

See the log4j logs for more detail. The process's stderr output appears line by line under Stacktrace below.
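The --class argument names a small standalone driver that the suite submits as its test payload; given the spark.sql.test.expectedWarehouseDir conf above, it presumably compares the live session's warehouse location against that expected path. A hypothetical sketch of such a check (the object name and assertion logic are assumptions, not Spark's actual SetWarehouseLocationTest source):

  import org.apache.spark.sql.SparkSession

  // Hypothetical sketch: read the expected warehouse dir passed via --conf
  // and fail fast if the running session disagrees.
  object SetWarehouseLocationTestSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
      val expected = spark.conf.get("spark.sql.test.expectedWarehouseDir")
      val actual = spark.conf.get("spark.sql.warehouse.dir")
      if (!actual.contains(expected)) {
        throw new IllegalStateException(
          s"Expected warehouse dir $expected but got $actual")
      }
      spark.stop()
    }
  }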

Stacktrace

sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: Timeout of './bin/spark-submit' '--class' 'org.apache.spark.sql.hive.SetWarehouseLocationTest' '--name' 'SetHiveWarehouseLocationTest' '--master' 'local-cluster[2,1,1024]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.test.expectedWarehouseDir=/home/jenkins/workspace/NewSparkPullRequestBuilder@5/target/tmp/spark-933ecf4f-2079-4057-93b6-86c95e1baf1d' '--conf' 'spark.driver.extraClassPath=/home/jenkins/workspace/NewSparkPullRequestBuilder@5/target/tmp/spark-63b84922-0b96-4ba0-8835-5bbe790e2f73' '--driver-java-options' '-Dderby.system.durability=test' 'file:/home/jenkins/workspace/NewSparkPullRequestBuilder@5/target/tmp/spark-7a3c4be6-c71b-467d-b9e9-bbaa8e51d601/testJar-1589563434560.jar' See the log4j logs for more detail.
2020-05-15 10:24:45.136 - stderr> SLF4J: Class path contains multiple SLF4J bindings.
2020-05-15 10:24:45.147 - stderr> SLF4J: Found binding in [jar:file:/home/jenkins/workspace/NewSparkPullRequestBuilder@5/assembly/target/scala-2.12/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-05-15 10:24:45.147 - stderr> SLF4J: Found binding in [jar:file:/home/sparkivy/per-executor-caches/7/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-05-15 10:24:45.148 - stderr> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2020-05-15 10:24:45.148 - stderr> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-05-15 10:25:55.514 - stderr> 20/05/15 10:25:55 INFO SparkContext: Running Spark version 3.0.1-SNAPSHOT
2020-05-15 10:26:02.851 - stderr> 20/05/15 10:26:02 INFO ResourceUtils: ==============================================================
2020-05-15 10:26:02.863 - stderr> 20/05/15 10:26:02 INFO ResourceUtils: Resources for spark.driver:
2020-05-15 10:26:02.863 - stderr> 
2020-05-15 10:26:02.863 - stderr> 20/05/15 10:26:02 INFO ResourceUtils: ==============================================================
2020-05-15 10:26:02.864 - stderr> 20/05/15 10:26:02 INFO SparkContext: Submitted application: SetHiveWarehouseLocationTest
2020-05-15 10:26:16.33 - stderr> 20/05/15 10:26:16 INFO SecurityManager: Changing view acls to: jenkins
2020-05-15 10:26:16.331 - stderr> 20/05/15 10:26:16 INFO SecurityManager: Changing modify acls to: jenkins
2020-05-15 10:26:16.331 - stderr> 20/05/15 10:26:16 INFO SecurityManager: Changing view acls groups to: 
2020-05-15 10:26:16.331 - stderr> 20/05/15 10:26:16 INFO SecurityManager: Changing modify acls groups to: 
2020-05-15 10:26:16.331 - stderr> 20/05/15 10:26:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2020-05-15 10:26:24.856 - stderr> 20/05/15 10:26:24 INFO Utils: Successfully started service 'sparkDriver' on port 46662.
2020-05-15 10:26:26.349 - stderr> 20/05/15 10:26:26 INFO SparkEnv: Registering MapOutputTracker
2020-05-15 10:26:29.408 - stderr> 20/05/15 10:26:29 INFO SparkEnv: Registering BlockManagerMaster
2020-05-15 10:26:29.444 - stderr> 20/05/15 10:26:29 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2020-05-15 10:26:29.445 - stderr> 20/05/15 10:26:29 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2020-05-15 10:26:29.628 - stderr> 20/05/15 10:26:29 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
2020-05-15 10:28:00.417 - stderr> 20/05/15 10:28:00 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-87f5add7-15d8-4f43-8183-6e4a82bbfc7e
2020-05-15 10:28:07.729 - stderr> 20/05/15 10:28:07 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
2020-05-15 10:28:18.106 - stderr> 20/05/15 10:28:18 INFO SparkEnv: Registering OutputCommitCoordinator
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:530)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:529)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions.fail(Assertions.scala:1107)
	at org.scalatest.Assertions.fail$(Assertions.scala:1103)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit(SparkSubmitTestUtils.scala:105)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit$(SparkSubmitTestUtils.scala:41)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.runSparkSubmit(HiveSparkSubmitSuite.scala:48)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.$anonfun$new$15(HiveSparkSubmitSuite.scala:257)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:151)
	at org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
	at org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:286)
	at org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
	at org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:58)
	at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:221)
	at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:214)
	at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:58)
	at org.scalatest.FunSuiteLike.$anonfun$runTests$1(FunSuiteLike.scala:229)
	at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:393)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:381)
	at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:376)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:458)
	at org.scalatest.FunSuiteLike.runTests(FunSuiteLike.scala:229)
	at org.scalatest.FunSuiteLike.runTests$(FunSuiteLike.scala:228)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
	at org.scalatest.Suite.run(Suite.scala:1124)
	at org.scalatest.Suite.run$(Suite.scala:1106)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike.$anonfun$run$1(FunSuiteLike.scala:233)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:518)
	at org.scalatest.FunSuiteLike.run(FunSuiteLike.scala:233)
	at org.scalatest.FunSuiteLike.run$(FunSuiteLike.scala:232)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:58)
	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:58)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:317)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:510)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to failAfter did not complete within 300 seconds.
	at java.lang.Thread.getStackTrace(Thread.java:1559)
	at org.scalatest.concurrent.TimeLimits.failAfterImpl(TimeLimits.scala:234)
	at org.scalatest.concurrent.TimeLimits.failAfterImpl$(TimeLimits.scala:233)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.failAfterImpl(HiveSparkSubmitSuite.scala:48)
	at org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:230)
	at org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:229)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.failAfter(HiveSparkSubmitSuite.scala:48)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit(SparkSubmitTestUtils.scala:88)
	... 49 more
Caused by: sbt.ForkMain$ForkError: java.lang.InterruptedException: null
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:502)
	at java.lang.UNIXProcess.waitFor(UNIXProcess.java:395)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.$anonfun$runSparkSubmit$5(SparkSubmitTestUtils.scala:88)
	at scala.runtime.java8.JFunction0$mcI$sp.apply(JFunction0$mcI$sp.java:23)
	at org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
	at org.scalatest.concurrent.TimeLimits.failAfterImpl(TimeLimits.scala:239)
	... 55 more
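Reading the causal chain bottom-up: the helper blocks in Process.waitFor inside ScalaTest's failAfter (SparkSubmitTestUtils.scala:88); when the 300-second limit expired, ScalaTest interrupted the waiting thread (the InterruptedException above), raised TestFailedDueToTimeoutException, and the helper rethrew that as the TestFailedException carrying the full command line. A minimal sketch of that pattern, with hypothetical names rather than the actual SparkSubmitTestUtils code:

  import org.scalatest.concurrent.{Signaler, ThreadSignaler, TimeLimits}
  import org.scalatest.time.SpanSugar._

  // Hypothetical sketch of the failAfter-around-waitFor pattern implied by
  // the stack trace above.
  object SubmitTimeoutSketch extends TimeLimits {
    // ThreadSignaler tells failAfter to interrupt the blocked thread when the
    // limit expires -- hence the InterruptedException inside waitFor above.
    implicit val signaler: Signaler = ThreadSignaler

    def runCommand(cmd: Seq[String]): Int = {
      val process = new ProcessBuilder(cmd: _*).start()
      try {
        // Block on the child process, but give up after five minutes; on
        // expiry ScalaTest throws TestFailedDueToTimeoutException ("The code
        // passed to failAfter did not complete within 300 seconds.").
        failAfter(300.seconds) {
          process.waitFor()
        }
      } finally {
        process.destroy()
      }
    }
  }

Note that the stderr log above shows the child JVM still registering core Spark services roughly three and a half minutes after launch (10:24:45 through 10:28:18), which suggests the 300-second budget was exhausted during SparkContext startup rather than in the test body.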