Failed
Console Output

Skipping 22,760 KB..
[info] - 2.3: getPartitions(db: String, table: String) (44 milliseconds)
[info] - 2.3: loadPartition (84 milliseconds)
[info] - 2.3: loadDynamicPartitions (61 milliseconds)
[info] - 2.3: renamePartitions (256 milliseconds)
[info] - 2.3: alterPartitions (126 milliseconds)
[info] - 2.3: dropPartitions (183 milliseconds)
[info] - 2.3: createFunction (23 milliseconds)
[info] - 2.3: functionExists (14 milliseconds)
[info] - 2.3: renameFunction (13 milliseconds)
[info] - 2.3: alterFunction (8 milliseconds)
[info] - 2.3: getFunction (2 milliseconds)
[info] - 2.3: getFunctionOption (16 milliseconds)
[info] - 2.3: listFunctions (10 milliseconds)
[info] - 2.3: dropFunction (20 milliseconds)
[info] - 2.3: sql set command (16 milliseconds)
[info] - 2.3: sql create index and reset (1 second, 167 milliseconds)
18:28:38.659 WARN org.apache.hadoop.hive.ql.Driver: Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
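
    // Aside: the Hive-on-MR deprecation above is driven by a HiveConf setting.
    // A minimal Scala sketch of the suggested remedy, assuming a Tez (or
    // Hive-on-Spark) engine is actually installed in the environment:
    import org.apache.hadoop.hive.conf.HiveConf

    val hiveConf = new HiveConf() // HiveConf extends Hadoop Configuration
    // "mr" is the deprecated default; "tez" and "spark" assume those engines exist
    hiveConf.set("hive.execution.engine", "tez")
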
18:28:38.955 WARN org.apache.hadoop.mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
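
    // Aside: the JobResourceUploader warning above names its own remedy:
    // implement org.apache.hadoop.util.Tool and launch through ToolRunner so
    // Hadoop's generic options (-D, -files, -libjars, ...) are parsed. A
    // minimal sketch; the driver class name is hypothetical and the actual
    // job setup is elided:
    import org.apache.hadoop.conf.{Configuration, Configured}
    import org.apache.hadoop.util.{Tool, ToolRunner}

    class MyJobDriver extends Configured with Tool {
      override def run(args: Array[String]): Int = {
        val conf = getConf // Configuration already populated by ToolRunner
        // ... configure and submit the job using `conf` and `args` ...
        0 // exit code, 0 = success
      }
    }

    object MyJobDriver {
      def main(args: Array[String]): Unit = {
        // ToolRunner consumes the generic options, then hands the rest to run()
        sys.exit(ToolRunner.run(new Configuration(), new MyJobDriver, args))
      }
    }
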
18:28:40.872 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18:28:40.873 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18:28:41.001 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
18:28:41.001 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
18:28:41.002 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
18:28:41.040 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18:28:41.047 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
[info] - 2.3: sql read hive materialized view (2 seconds, 949 milliseconds)
[info] - 2.3: version (0 milliseconds)
[info] - 2.3: getConf (0 milliseconds)
[info] - 2.3: setOut (1 millisecond)
[info] - 2.3: setInfo (1 millisecond)
[info] - 2.3: setError (1 millisecond)
[info] - 2.3: newSession (189 milliseconds)
[info] - 2.3: withHiveState and addJar (7 milliseconds)
18:28:41.881 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-88f09d92-7173-4667-b6c1-99bdd8884e5b does not exist; Force to delete it.
18:28:41.881 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-88f09d92-7173-4667-b6c1-99bdd8884e5b
18:28:41.882 WARN org.apache.hadoop.hive.metastore.txn.TxnHandler: Cannot perform cleanup since metastore table does not exist
18:28:41.889 WARN org.apache.hadoop.hive.metastore.txn.TxnHandler: Cannot perform cleanup since metastore table does not exist
[info] - 2.3: reset (611 milliseconds)
18:28:41.922 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-2a321266-4858-4645-afdb-b98cde87e81e/tbl specified for non-external table:tbl
18:28:42.319 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.4 KiB
[info] - 2.3: CREATE TABLE AS SELECT (1 second, 60 milliseconds)
18:28:42.972 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-2a321266-4858-4645-afdb-b98cde87e81e/tbl specified for non-external table:tbl
18:28:43.398 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.8 KiB
[info] - 2.3: CREATE Partitioned TABLE AS SELECT (1 second, 782 milliseconds)
18:28:45.018 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.6 KiB
18:28:45.629 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.6 KiB
18:28:46.273 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.6 KiB
[info] - 2.3: Delete the temporary staging directory and files after each insert (2 seconds, 20 milliseconds)
18:28:47.129 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 237.1 KiB
[info] - 2.3: SPARK-13709: reading partitioned Avro table with nested schema (1 second, 325 milliseconds)
18:28:48.245 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 160.1 KiB
18:28:48.595 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 156.7 KiB
[info] - 2.3: CTAS for managed data source tables (860 milliseconds)
18:28:49.009 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-2a321266-4858-4645-afdb-b98cde87e81e/tab1 specified for non-external table:tab1
18:28:49.380 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 236.1 KiB
18:28:50.448 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-2a321266-4858-4645-afdb-b98cde87e81e/tab1 specified for non-external table:tab1
18:28:50.792 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 235.8 KiB
[info] - 2.3: Decimal support of Avro Hive serde (2 seconds, 358 milliseconds)
[info] - 2.3: read avro file containing decimal (210 milliseconds)
18:28:51.581 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-2a321266-4858-4645-afdb-b98cde87e81e/tab1 specified for non-external table:tab1
18:28:51.982 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 236.8 KiB
18:28:52.787 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 236.8 KiB
[info] - 2.3: SPARK-17920: Insert into/overwrite avro table (1 second, 967 milliseconds)
Hive Session ID = 7cd47ecb-45be-436f-b3e3-a7050fd1ea76
18:28:56.589 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
18:28:56.589 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
18:28:56.590 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
18:28:56.609 WARN org.apache.hadoop.hive.ql.session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
[info] - 3.1: create client (3 seconds, 136 milliseconds)
18:28:57.381 WARN org.apache.hadoop.hive.metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
18:28:57.914 WARN com.zaxxer.hikari.util.DriverDataSource: Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
18:28:59.031 WARN com.zaxxer.hikari.util.DriverDataSource: Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
18:28:59.861 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:28:59.861 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:28:59.862 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:28:59.862 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:28:59.863 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:28:59.863 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:00.390 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:00.390 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:00.391 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:00.391 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:00.391 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:00.391 WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
18:29:03.241 WARN org.apache.hadoop.hive.metastore.ObjectStore: Version information not found in metastore. metastore.schema.verification is not enabled so recording the schema version 3.1.0
18:29:03.241 WARN org.apache.hadoop.hive.metastore.ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 3.1.0, comment = Set by MetaStore jenkins@192.168.10.26
18:29:03.411 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.default, returning NoSuchObjectException
18:29:03.876 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.temporary, returning NoSuchObjectException
[info] - 3.1: createDatabase (7 seconds, 254 milliseconds)
18:29:03.904 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.dbWithNullDesc, returning NoSuchObjectException
[info] - 3.1: createDatabase with null description (46 milliseconds)
[info] - 3.1: setCurrentDatabase (2 milliseconds)
18:29:03.961 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.nonexist, returning NoSuchObjectException
[info] - 3.1: getDatabase (4 milliseconds)
18:29:03.967 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.nonexist, returning NoSuchObjectException
[info] - 3.1: databaseExists (4 milliseconds)
[info] - 3.1: listDatabases (19 milliseconds)
[info] - 3.1: alterDatabase (331 milliseconds)
18:29:04.621 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.temporary, returning NoSuchObjectException
[info] - 3.1: dropDatabase (296 milliseconds)
18:29:04.713 WARN org.apache.hadoop.hive.ql.session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
18:29:04.722 WARN org.apache.hadoop.hive.metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
18:29:04.809 WARN org.apache.hadoop.hive.metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
[info] - 3.1: createTable (409 milliseconds)
[info] - 3.1: loadTable (509 milliseconds)
[info] - 3.1: tableExists (14 milliseconds)
[info] - 3.1: getTable (25 milliseconds)
[info] - 3.1: getTableOption (11 milliseconds)
[info] - 3.1: alterTable(table: CatalogTable) (51 milliseconds)
[info] - 3.1: alterTable(dbName: String, tableName: String, table: CatalogTable) (58 milliseconds)
[info] - 3.1: alterTable - rename (83 milliseconds)
18:29:05.808 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.temporary, returning NoSuchObjectException
[info] - 3.1: alterTable - change database (75 milliseconds)
[info] - 3.1: alterTable - change database and table names (54 milliseconds)
[info] - 3.1: listTables(database) (12 milliseconds)
[info] - 3.1: listTables(database, pattern) (23 milliseconds)
[info] - 3.1: dropTable (436 milliseconds)
[info] - 3.1: sql create partitioned table (32 milliseconds)
[info] - 3.1: createPartitions (111 milliseconds)
[info] - 3.1: getPartitionNames(catalogTable) (39 milliseconds)
[info] - 3.1: getPartitions(catalogTable) (87 milliseconds)
[info] - 3.1: getPartitionsByFilter (172 milliseconds)
[info] - 3.1: getPartition (44 milliseconds)
[info] - 3.1: getPartitionOption(db: String, table: String, spec: TablePartitionSpec) (32 milliseconds)
[info] - 3.1: getPartitionOption(table: CatalogTable, spec: TablePartitionSpec) (32 milliseconds)
[info] - 3.1: getPartitions(db: String, table: String) (46 milliseconds)
18:29:07.098 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updating partition stats fast for: src_part
18:29:07.099 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updated size to 0
[info] - 3.1: loadPartition (81 milliseconds)
[info] - 3.1: loadDynamicPartitions (24 milliseconds)
18:29:07.208 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updating partition stats fast for: src_part
18:29:07.208 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updated size to 0
[info] - 3.1: renamePartitions (133 milliseconds)
18:29:07.313 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updating partition stats fast for: src_part
18:29:07.313 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updated size to 0
[info] - 3.1: alterPartitions (123 milliseconds)
[info] - 3.1: dropPartitions (207 milliseconds)
[info] - 3.1: createFunction (37 milliseconds)
[info] - 3.1: functionExists (15 milliseconds)
[info] - 3.1: renameFunction (15 milliseconds)
[info] - 3.1: alterFunction (7 milliseconds)
[info] - 3.1: getFunction (2 milliseconds)
[info] - 3.1: getFunctionOption (20 milliseconds)
[info] - 3.1: listFunctions (9 milliseconds)
[info] - 3.1: dropFunction (13 milliseconds)
[info] - 3.1: sql set command (17 milliseconds)
[info] - 3.1: sql create index and reset (0 milliseconds)
[info] - 3.1: sql read hive materialized view (0 milliseconds)
[info] - 3.1: version (0 milliseconds)
[info] - 3.1: getConf (0 milliseconds)
[info] - 3.1: setOut (1 millisecond)
[info] - 3.1: setInfo (1 millisecond)
[info] - 3.1: setError (1 millisecond)
[info] - 3.1: newSession (196 milliseconds)
[info] - 3.1: withHiveState and addJar (10 milliseconds)
18:29:08.367 WARN org.apache.hadoop.hive.metastore.utils.FileUtils: File file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-bad67bf0-b003-495d-ab49-390f1183efa7 does not exist; Force to delete it.
18:29:08.367 ERROR org.apache.hadoop.hive.metastore.utils.FileUtils: Failed to delete file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-bad67bf0-b003-495d-ab49-390f1183efa7
[info] - 3.1: reset (408 milliseconds)
18:29:08.461 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-8db41a89-4012-4c4a-920f-5d1d3aa01654/tbl specified for non-external table:tbl
18:29:08.824 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.5 KiB
18:29:09.099 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database hive.global_temp, returning NoSuchObjectException
[info] - 3.1: CREATE TABLE AS SELECT (930 milliseconds)
18:29:09.374 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-8db41a89-4012-4c4a-920f-5d1d3aa01654/tbl specified for non-external table:tbl
18:29:09.651 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 234.0 KiB
[info] - 3.1: CREATE Partitioned TABLE AS SELECT (1 second, 306 milliseconds)
18:29:10.943 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.7 KiB
18:29:11.437 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.7 KiB
18:29:11.988 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 233.7 KiB
[info] - 3.1: Delete the temporary staging directory and files after each insert (1 second, 750 milliseconds)
18:29:12.818 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 237.2 KiB
18:29:13.160 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updating partition stats fast for: spark_13709
18:29:13.162 WARN org.apache.hadoop.hive.metastore.utils.MetaStoreUtils: Updated size to 264
[info] - 3.1: SPARK-13709: reading partitioned Avro table with nested schema (1 second, 127 milliseconds)
18:29:13.714 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 160.1 KiB
18:29:14.084 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 156.7 KiB
[info] - 3.1: CTAS for managed data source tables (912 milliseconds)
18:29:14.517 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-8db41a89-4012-4c4a-920f-5d1d3aa01654/tab1 specified for non-external table:tab1
18:29:14.759 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 236.3 KiB
18:29:15.350 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-8db41a89-4012-4c4a-920f-5d1d3aa01654/tab1 specified for non-external table:tab1
18:29:15.680 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 235.9 KiB
[info] - 3.1: Decimal support of Avro Hive serde (1 second, 630 milliseconds)
[info] - 3.1: read avro file containing decimal (213 milliseconds)
18:29:16.372 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/NewSparkPullRequestBuilder/target/tmp/spark-8db41a89-4012-4c4a-920f-5d1d3aa01654/tab1 specified for non-external table:tab1
18:29:16.628 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 236.9 KiB
18:29:17.429 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 236.9 KiB
[info] - 3.1: SPARK-17920: Insert into/overwrite avro table (1 second, 677 milliseconds)
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.testUDAF started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.saveTableAndQueryIt started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 2.4s
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveExternalTableAndQueryIt started
18:29:20.695 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 165.4 KiB
18:29:20.873 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18:29:21.209 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`externaltable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
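
    // Aside: the two TestHiveExternalCatalog warnings above describe the
    // behavior these Java suites exercise: saveAsTable with a provider that
    // has no matching Hive SerDe (org.apache.spark.sql.json here) persists
    // the table in a Spark SQL specific metastore format that Hive itself
    // cannot read. A sketch of the contrast, with illustrative table names:
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("serde-compat-sketch") // illustrative app name
      .enableHiveSupport()
      .getOrCreate()

    val df = spark.range(10).toDF("id")

    // JSON has no bundled Hive SerDe: persisted in a Spark-specific format,
    // NOT readable from Hive itself (triggers the warning above).
    df.write.format("json").saveAsTable("json_backed_table")

    // Parquet (or ORC) maps to a Hive SerDe, so Spark can record
    // Hive-compatible metadata when the schema allows it.
    df.write.format("parquet").saveAsTable("parquet_backed_table")
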
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveTableAndQueryIt started
18:29:21.770 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 165.4 KiB
18:29:21.953 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveExternalTableWithSchemaAndQueryIt started
18:29:22.386 WARN org.apache.spark.scheduler.DAGScheduler: Broadcasting large task binary with size 165.4 KiB
18:29:22.563 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18:29:22.826 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`externaltable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test run finished: 0 failed, 0 ignored, 3 total, 2.794s
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 38 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 24 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 37 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 38 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 24 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 44, Failed 0, Errors 0, Passed 44
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 50, Failed 0, Errors 0, Passed 50
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 81, Failed 0, Errors 0, Passed 81
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 102, Failed 0, Errors 0, Passed 101, Skipped 1
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 19
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 19, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[info] Passed: Total 80, Failed 0, Errors 0, Passed 80, Ignored 1
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 29
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 29, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 29, Failed 0, Errors 0, Passed 29
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 2235
[info] Suites: completed 221, aborted 0
[info] Tests: succeeded 2235, failed 0, canceled 8, ignored 7, pending 0
[info] All tests passed.
[info] Passed: Total 2484, Failed 0, Errors 0, Passed 2484, Ignored 7, Canceled 8
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 41 seconds.
[info] Total number of tests run: 85
[info] Suites: completed 8, aborted 0
[info] Tests: succeeded 85, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 85, Failed 0, Errors 0, Passed 85
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 21
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 21, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 21, Failed 0, Errors 0, Passed 21
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 104
[info] Suites: completed 19, aborted 0
[info] Tests: succeeded 104, failed 0, canceled 6, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 104, Failed 0, Errors 0, Passed 104, Canceled 6
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 20
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 20, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 24, Failed 0, Errors 0, Passed 24
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 104
[info] Suites: completed 10, aborted 0
[info] Tests: succeeded 104, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 104, Failed 0, Errors 0, Passed 104
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 93
[info] Suites: completed 25, aborted 0
[info] Tests: succeeded 93, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 93, Failed 0, Errors 0, Passed 93
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 103
[info] Suites: completed 17, aborted 0
[info] Tests: succeeded 103, failed 0, canceled 18, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 103, Failed 0, Errors 0, Passed 103, Canceled 18
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 35 seconds.
[info] Total number of tests run: 335
[info] Suites: completed 40, aborted 0
[info] Tests: succeeded 335, failed 0, canceled 0, ignored 1, pending 0
[info] All tests passed.
[info] Passed: Total 435, Failed 0, Errors 0, Passed 435, Ignored 1
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 26 seconds.
[info] Total number of tests run: 36
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 36, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 36, Failed 0, Errors 0, Passed 36
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 24 seconds.
[info] Total number of tests run: 3310
[info] Suites: completed 200, aborted 0
[info] Tests: succeeded 3310, failed 0, canceled 0, ignored 2, pending 0
[info] All tests passed.
[info] Passed: Total 3341, Failed 0, Errors 0, Passed 3341, Ignored 2
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 34 seconds.
[info] Total number of tests run: 59
[info] Suites: completed 8, aborted 0
[info] Tests: succeeded 59, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 66, Failed 0, Errors 0, Passed 66
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 18 seconds.
[info] Total number of tests run: 5041
[info] Suites: completed 271, aborted 0
[info] Tests: succeeded 5041, failed 0, canceled 0, ignored 24, pending 0
[info] All tests passed.
[info] Passed: Total 5145, Failed 0, Errors 0, Passed 5145, Ignored 24
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 18 seconds.
[info] Total number of tests run: 45
[info] Suites: completed 10, aborted 0
[info] Tests: succeeded 45, failed 0, canceled 0, ignored 2, pending 0
[info] All tests passed.
[info] Passed: Total 45, Failed 0, Errors 0, Passed 45, Ignored 2
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 18 seconds.
[info] Total number of tests run: 98
[info] Suites: completed 4, aborted 0
[info] Tests: succeeded 98, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 99, Failed 0, Errors 0, Passed 99
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 18 seconds.
[info] Total number of tests run: 171
[info] Suites: completed 16, aborted 0
[info] Tests: succeeded 171, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 171, Failed 0, Errors 0, Passed 171
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 15 seconds.
[info] Total number of tests run: 1436
[info] Suites: completed 193, aborted 0
[info] Tests: succeeded 1436, failed 0, canceled 0, ignored 7, pending 0
[info] All tests passed.
[info] Passed: Total 1558, Failed 0, Errors 0, Passed 1558, Ignored 7
[info] ScalaTest
[info] Run completed in 4 hours, 33 minutes, 15 seconds.
[info] Total number of tests run: 3188
[info] Suites: completed 100, aborted 0
[info] Tests: succeeded 3188, failed 0, canceled 0, ignored 592, pending 0
[info] All tests passed.
[info] Passed: Total 3193, Failed 0, Errors 0, Passed 3193, Ignored 592
[success] Total time: 16429 s, completed Jul 13, 2019 6:30:14 PM

========================================================================
Running PySpark tests
========================================================================
Running PySpark tests. Output is in /home/jenkins/workspace/NewSparkPullRequestBuilder/python/unit-tests.log
Will test against the following Python executables: ['python2.7', 'pypy']
Will test the following Python modules: ['pyspark-core', 'pyspark-ml', 'pyspark-mllib', 'pyspark-sql', 'pyspark-streaming']
Starting test(pypy): pyspark.sql.tests.test_appsubmit
Starting test(pypy): pyspark.sql.tests.test_arrow
Starting test(pypy): pyspark.sql.tests.test_catalog
Starting test(pypy): pyspark.sql.tests.test_context
Starting test(pypy): pyspark.sql.tests.test_column
Starting test(pypy): pyspark.sql.tests.test_conf
Starting test(pypy): pyspark.sql.tests.test_dataframe
Starting test(pypy): pyspark.sql.tests.test_datasources
Finished test(pypy): pyspark.sql.tests.test_arrow (0s) ... 46 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_functions
Finished test(pypy): pyspark.sql.tests.test_conf (9s)
Starting test(pypy): pyspark.sql.tests.test_group
Finished test(pypy): pyspark.sql.tests.test_column (20s)
Starting test(pypy): pyspark.sql.tests.test_pandas_udf
Finished test(pypy): pyspark.sql.tests.test_catalog (20s)
Starting test(pypy): pyspark.sql.tests.test_pandas_udf_grouped_agg
Finished test(pypy): pyspark.sql.tests.test_pandas_udf (0s) ... 6 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_pandas_udf_grouped_map
Finished test(pypy): pyspark.sql.tests.test_pandas_udf_grouped_agg (0s) ... 13 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_pandas_udf_scalar
Finished test(pypy): pyspark.sql.tests.test_pandas_udf_grouped_map (0s) ... 17 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_pandas_udf_window
Finished test(pypy): pyspark.sql.tests.test_pandas_udf_scalar (0s) ... 45 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_readwriter
Finished test(pypy): pyspark.sql.tests.test_pandas_udf_window (0s) ... 14 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_serde
Finished test(pypy): pyspark.sql.tests.test_datasources (25s)
Starting test(pypy): pyspark.sql.tests.test_session
Finished test(pypy): pyspark.sql.tests.test_group (19s)
Starting test(pypy): pyspark.sql.tests.test_streaming
Finished test(pypy): pyspark.sql.tests.test_functions (28s)
Starting test(pypy): pyspark.sql.tests.test_types
Finished test(pypy): pyspark.sql.tests.test_dataframe (38s) ... 5 tests were skipped
Starting test(pypy): pyspark.sql.tests.test_udf
Finished test(pypy): pyspark.sql.tests.test_serde (21s)
Starting test(pypy): pyspark.sql.tests.test_utils
Build timed out (after 300 minutes). Marking the build as failed.
Build was aborted
Archiving artifacts
Finished test(pypy): pyspark.sql.tests.test_readwriter (29s)
Starting test(pypy): pyspark.streaming.tests.test_context
Recording test results
Finished: FAILURE