
Skipping 14,807 KB..
6zEskT9nMS8dP3gkqLMvHTriiIGKaihyfl5xfk5qXrOEBpkDgMEMDIxMFQUlDCw2+gXFyTm2QEAI9P8iI4AAAA=.applySchemaToJSON started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.787s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaRowSuite.constructSimpleRow started
[info] Test test.org.apache.spark.sql.JavaRowSuite.constructComplexRow started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.002s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testRuntimeNullabilityCheck started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testCircularReferenceBean1 started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testCircularReferenceBean2 started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testCircularReferenceBean3 started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testSerializeNull started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testRandomSplit started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testTypedFilterPreservingSchema started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testLocalDateAndInstantEncoders started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testJoin started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testTake started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testToLocalIterator started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testSpecificLists started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testForeach started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testJavaEncoder started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testPrimitiveEncoder started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testEmptyBean started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testCommonOperation started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testNullInTopLevelBean started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testGroupBy started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testSetOperation started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testBeanWithEnum started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testKryoEncoder started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.test started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testJavaBeanEncoder2 started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testCollect started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testKryoEncoderErrorMessageForPrivateClass started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testJavaBeanEncoder started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testTupleEncoder started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testNestedTupleEncoder started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testTupleEncoderSchema started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testReduce started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testSelect started
[info] Test test.org.apache.spark.sql.JavaDatasetSuite.testJavaEncoderErrorMessageForPrivateClass started
[info] Test run finished: 0 failed, 0 ignored, 33 total, 21.786s
[info] Test run started
[info] Test test.org.apache.spark.sql.Java8DatasetAggregatorSuite.testTypedAggregationCount started
[info] Test test.org.apache.spark.sql.Java8DatasetAggregatorSuite.testTypedAggregationSumDouble started
[info] Test test.org.apache.spark.sql.Java8DatasetAggregatorSuite.testTypedAggregationSumLong started
[info] Test test.org.apache.spark.sql.Java8DatasetAggregatorSuite.testTypedAggregationAverage started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 2.594s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaSaveLoadSuite.saveAndLoadWithSchema started
[info] Test test.org.apache.spark.sql.JavaSaveLoadSuite.saveAndLoad started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 1.134s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testFormatAPI started
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testTextAPI started
09:59:55.114 WARN org.apache.spark.sql.execution.datasources.DataSource: All paths were ignored:
  
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testJsonAPI started
09:59:55.329 WARN org.apache.spark.sql.execution.datasources.DataSource: All paths were ignored:
  
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testLoadAPI started
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testOptionsAPI started
09:59:55.663 WARN org.apache.spark.sql.execution.datasources.DataSource: All paths were ignored:
  
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testSaveModeAPI started
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testCsvAPI started
09:59:55.898 WARN org.apache.spark.sql.execution.datasources.DataSource: All paths were ignored:
  
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testParquetAPI started
09:59:56.181 WARN org.apache.spark.sql.execution.datasources.DataSource: All paths were ignored:
  
[info] Test test.org.apache.spark.sql.JavaDataFrameReaderWriterSuite.testTextFileAPI started
09:59:56.501 WARN org.apache.spark.sql.execution.datasources.DataSource: All paths were ignored:
  
[info] Test run finished: 0 failed, 0 ignored, 9 total, 1.618s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCollectAndTake started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testJsonRDDToDataFrame started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testVarargMethods started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testBeanWithoutGetter started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCreateStructTypeFromList started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testSampleBy started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCrosstab started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testUDF started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCreateDataFromFromList started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCircularReferenceBean started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testFrequentItems started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testSampleByColumn started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testExecution started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testTextLoad started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.pivot started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testGenericLoad started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCountMinSketch started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.pivotColumnValues started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCreateDataFrameFromJavaBeans started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCorrelation started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testBloomFilter started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCovariance started
[info] Test test.org.apache.spark.sql.JavaDataFrameSuite.testCreateDataFrameFromLocalJavaBeans started
[info] Test run finished: 0 failed, 0 ignored, 23 total, 13.759s
[info] Test run started
[info] Test test.org.apache.spark.sql.streaming.JavaDataStreamReaderWriterSuite.testForeachBatchAPI started
10:00:10.411 WARN org.apache.spark.sql.streaming.StreamingQueryManager: Temporary checkpoint location created which is deleted normally when the query didn't fail: /home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/temporary-1417fd91-cb16-49b8-83ae-f0a22c1276df. If it's required to delete it under any circumstances, please set spark.sql.streaming.forceDeleteTempCheckpointLocation to true. Important to know deleting temp checkpoint folder is best effort.
10:00:10.434 WARN org.apache.hadoop.util.Shell: Interrupted while joining on: Thread[Thread-79416,5,]
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.util.Shell.joinThread(Shell.java:629)
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:580)
	at org.apache.hadoop.util.Shell.run(Shell.java:482)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:776)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:869)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:852)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:491)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
	at org.apache.hadoop.fs.FileSystem.primitiveMkdir(FileSystem.java:1066)
	at org.apache.hadoop.fs.DelegateToFileSystem.mkdir(DelegateToFileSystem.java:183)
	at org.apache.hadoop.fs.FilterFs.mkdir(FilterFs.java:201)
	at org.apache.hadoop.fs.FileContext$4.next(FileContext.java:730)
	at org.apache.hadoop.fs.FileContext$4.next(FileContext.java:726)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.mkdir(FileContext.java:733)
	at org.apache.spark.sql.execution.streaming.FileContextBasedCheckpointFileManager.mkdirs(CheckpointFileManager.scala:303)
	at org.apache.spark.sql.execution.streaming.HDFSMetadataLog.<init>(HDFSMetadataLog.scala:66)
	at org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog.<init>(CompactibleFileStreamLog.scala:46)
	at org.apache.spark.sql.execution.streaming.FileStreamSourceLog.<init>(FileStreamSourceLog.scala:36)
	at org.apache.spark.sql.execution.streaming.FileStreamSource.<init>(FileStreamSource.scala:64)
	at org.apache.spark.sql.execution.datasources.DataSource.createSource(DataSource.scala:290)
	at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$1.$anonfun$applyOrElse$1(MicroBatchExecution.scala:86)
	at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
	at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$1.applyOrElse(MicroBatchExecution.scala:83)
	at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$1.applyOrElse(MicroBatchExecution.scala:81)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$1(TreeNode.scala:282)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:72)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:282)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDown(AnalysisHelper.scala:149)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDown$(AnalysisHelper.scala:147)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$3(TreeNode.scala:287)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:372)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:210)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:370)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:323)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:287)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDown(AnalysisHelper.scala:149)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDown$(AnalysisHelper.scala:147)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:271)
	at org.apache.spark.sql.execution.streaming.MicroBatchExecution.logicalPlan$lzycompute(MicroBatchExecution.scala:81)
	at org.apache.spark.sql.execution.streaming.MicroBatchExecution.logicalPlan(MicroBatchExecution.scala:61)
	at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:320)
	at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:244)
[info] Test test.org.apache.spark.sql.streaming.JavaDataStreamReaderWriterSuite.testForeachAPI started
10:00:10.524 WARN org.apache.spark.sql.streaming.StreamingQueryManager: Temporary checkpoint location created which is deleted normally when the query didn't fail: /home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/temporary-77adb04e-9c72-4bb9-9fcb-ccf10bfeddb6. If it's required to delete it under any circumstances, please set spark.sql.streaming.forceDeleteTempCheckpointLocation to true. Important to know deleting temp checkpoint folder is best effort.
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.248s
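Editor's note: the StreamingQueryManager warnings above state that temporary checkpoint directories are deleted only on a best-effort basis, and suggest `spark.sql.streaming.forceDeleteTempCheckpointLocation` for forced cleanup. A minimal sketch of setting that flag when building the session (the app name and local master are illustrative, not taken from this build):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: opt in to forced deletion of temporary streaming checkpoint
// locations, as the warning above suggests. App name/master are examples.
val spark = SparkSession.builder()
  .appName("checkpoint-cleanup-example")
  .master("local[*]")
  .config("spark.sql.streaming.forceDeleteTempCheckpointLocation", "true")
  .getOrCreate()
```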
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaDatasetAggregatorSuite.testTypedAggregationCount started
[info] Test test.org.apache.spark.sql.JavaDatasetAggregatorSuite.testTypedAggregationSumDouble started
[info] Test test.org.apache.spark.sql.JavaDatasetAggregatorSuite.testTypedAggregationSumLong started
[info] Test test.org.apache.spark.sql.JavaDatasetAggregatorSuite.testTypedAggregationAnonClass started
[info] Test test.org.apache.spark.sql.JavaDatasetAggregatorSuite.testTypedAggregationAverage started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 3.755s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaUDAFSuite.udf1Test started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 1.333s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaBeanDeserializationSuite.testBeanWithArrayFieldDeserialization started
[info] Test test.org.apache.spark.sql.JavaBeanDeserializationSuite.testSpark22000FailToUpcast started
[info] Test test.org.apache.spark.sql.JavaBeanDeserializationSuite.testSpark22000 started
[info] Test test.org.apache.spark.sql.JavaBeanDeserializationSuite.testBeanWithLocalDateAndInstant started
[info] Test test.org.apache.spark.sql.JavaBeanDeserializationSuite.testBeanWithMapFieldsDeserialization started
[info] Test run finished: 0 failed, 0 ignored, 5 total, 0.735s
[info] Test run started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorForSingleColumnRow started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorForArrayColumn started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorWhenOnlyTheLastColumnDiffers started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorForMixedColumns started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorForNullColumns started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorForMultipleColumnRow started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorWhenSubtractionIsDivisibleByMaxIntValue started
[info] Test test.org.apache.spark.sql.execution.sort.RecordBinaryComparatorSuite.testBinaryComparatorWhenSubtractionCanOverflowLongValue started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.003s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf1Test started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf2Test started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf3Test started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf4Test started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf5Test started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf6Test started
[info] Test test.org.apache.spark.sql.JavaUDFSuite.udf7Test started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 1.293s
[info] Test run started
[info] Test test.org.apache.spark.sql.JavaColumnExpressionSuite.isInCollectionCheckExceptionMessage started
[info] Test test.org.apache.spark.sql.JavaColumnExpressionSuite.isInCollectionWorksCorrectlyOnJava started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.283s
[info] ScalaTest
[info] Run completed in 50 minutes, 33 seconds.
[info] Total number of tests run: 5738
[info] Suites: completed 286, aborted 0
[info] Tests: succeeded 5737, failed 1, canceled 0, ignored 25, pending 0
[info] *** 1 TEST FAILED ***
[error] Failed: Total 5844, Failed 1, Errors 0, Passed 5843, Ignored 25
[error] Failed tests:
[error] 	org.apache.spark.sql.SQLQueryTestSuite
[error] (sql/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 3045 s, completed Jul 10, 2019 10:00:24 AM
[error] running /home/jenkins/workspace/NewSparkPullRequestBuilder/build/sbt -Phadoop-2.7 -Phive-thriftserver -Phive -Dtest.exclude.tags=org.apache.spark.tags.ExtendedHiveTest,org.apache.spark.tags.ExtendedYarnTest hive-thriftserver/test avro/test mllib/test hive/test repl/test catalyst/test sql/test sql-kafka-0-10/test examples/test ; received return code 1
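Editor's note: the summary above shows a single failing suite, org.apache.spark.sql.SQLQueryTestSuite, out of 5738 ScalaTest tests. Rather than re-running the entire `sql/test` target, one way to iterate on just that suite locally is sbt's standard `testOnly` selector (assuming a Spark source checkout with its bundled sbt launcher; profile flags mirror the build command above):

```shell
# Re-run only the failing suite instead of the whole sql/test target.
# Assumes the Spark source tree and its bundled build/sbt launcher.
build/sbt -Phadoop-2.7 -Phive "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite"
```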
Attempting to post to Github...
 > Post successful.
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Finished: FAILURE