[info] - 1.2: getPartitionsByFilter (128 milliseconds)
[info] - 1.2: getPartition (61 milliseconds)
[info] - 1.2: getPartitionOption(db: String, table: String, spec: TablePartitionSpec) (26 milliseconds)
[info] - 1.2: getPartitionOption(table: CatalogTable, spec: TablePartitionSpec) (26 milliseconds)
[info] - 1.2: getPartitions(db: String, table: String) (17 milliseconds)
[info] - 1.2: loadPartition (143 milliseconds)
[info] - 1.2: loadDynamicPartitions (18 milliseconds)
[info] - 1.2: renamePartitions (191 milliseconds)
[info] - 1.2: alterPartitions (125 milliseconds)
[info] - 1.2: dropPartitions (198 milliseconds)
[info] - 1.2: createFunction (19 milliseconds)
[info] - 1.2: functionExists (13 milliseconds)
[info] - 1.2: renameFunction (11 milliseconds)
[info] - 1.2: alterFunction (6 milliseconds)
[info] - 1.2: getFunction (2 milliseconds)
[info] - 1.2: getFunctionOption (12 milliseconds)
[info] - 1.2: listFunctions (9 milliseconds)
[info] - 1.2: dropFunction (16 milliseconds)
[info] - 1.2: sql set command (4 milliseconds)
[info] - 1.2: sql create index and reset (224 milliseconds)
[info] - 1.2: version (0 milliseconds)
[info] - 1.2: getConf (0 milliseconds)
[info] - 1.2: setOut (0 milliseconds)
[info] - 1.2: setInfo (1 millisecond)
[info] - 1.2: setError (0 milliseconds)
[info] - 1.2: newSession (98 milliseconds)
[info] - 1.2: withHiveState and addJar (11 milliseconds)
[info] - 1.2: reset (521 milliseconds)
19:02:33.428 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/tbl specified for non-external table:tbl
[info] - 1.2: CREATE TABLE AS SELECT (369 milliseconds)
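
The HiveMetaStore warning just above is logged whenever an explicit location is passed for a managed (non-external) table; Spark resolves the warehouse path itself and hands it to the metastore, so even a plain CTAS trips the check. A minimal sketch (table name is illustrative):

    // Creating a managed table through Spark is enough to produce the
    // "Location: ... specified for non-external table" warning, because
    // Spark passes the resolved warehouse path explicitly.
    spark.sql("CREATE TABLE tbl AS SELECT 1 AS id")
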
[info] - 1.2: Delete the temporary staging directory and files after each insert (660 milliseconds)
[info] MultiDatabaseSuite:
19:02:34.538 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_d4d2e103_3943_45e0_aa20_ee1bd51bca9a, returning NoSuchObjectException
[info] - saveAsTable() to non-default database - with USE - Overwrite (568 milliseconds)
19:02:35.106 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_9c6ab6e2_ae94_4ae6_ae3f_4c0783b0ed33, returning NoSuchObjectException
[info] - saveAsTable() to non-default database - without USE - Overwrite (239 milliseconds)
19:02:35.346 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_a88527e4_c5ad_4570_93b7_27949ecb9343, returning NoSuchObjectException
[info] - createExternalTable() to non-default database - with USE (546 milliseconds)
19:02:35.891 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_195c3ad6_2a23_4570_9a5a_7a5e199f3be8, returning NoSuchObjectException
[info] - createExternalTable() to non-default database - without USE (595 milliseconds)
19:02:36.486 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_911cbd12_36a0_4a69_a1a1_45275886a5c8, returning NoSuchObjectException
[info] - saveAsTable() to non-default database - with USE - Append (476 milliseconds)
19:02:36.963 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_00c9ceb0_1563_4684_ab31_576ec8b012c1, returning NoSuchObjectException
[info] - saveAsTable() to non-default database - without USE - Append (406 milliseconds)
19:02:37.369 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_d21c99ed_568e_420c_94ff_48eb5e4d62a7, returning NoSuchObjectException
[info] - insertInto() non-default database - with USE (359 milliseconds)
19:02:37.729 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_fd87280a_27a2_4cd0_9742_05696ec03d9e, returning NoSuchObjectException
[info] - insertInto() non-default database - without USE (432 milliseconds)
19:02:38.161 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_4aa15797_6714_4371_8c8c_053882495516, returning NoSuchObjectException
19:02:38.190 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/db_4aa15797_6714_4371_8c8c_053882495516.db/t specified for non-external table:t
[info] - Looks up tables in non-default database (152 milliseconds)
19:02:38.313 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_4ea49f7d_e4fa_4dc9_866f_3c7a19b421d9, returning NoSuchObjectException
19:02:38.345 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/db_4ea49f7d_e4fa_4dc9_866f_3c7a19b421d9.db/t specified for non-external table:t
[info] - Drops a table in a non-default database (289 milliseconds)
19:02:38.603 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_620bfe7f_88aa_42b5_93fa_8ee580f8ef8d, returning NoSuchObjectException
[info] - Refreshes a table in a non-default database - with USE (1 second, 595 milliseconds)
19:02:40.199 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database db_16c3e2d5_e2af_49de_852b_84e48ea29d30, returning NoSuchObjectException
[info] - Refreshes a table in a non-default database - without USE (1 second, 357 milliseconds)
19:02:41.556 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database d:b, returning NoSuchObjectException
19:02:41.559 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database d:b, returning NoSuchObjectException
19:02:41.567 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database d:b, returning NoSuchObjectException
[info] - invalid database name and table names (14 milliseconds)
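
The MultiDatabaseSuite tests above write to a non-default database in two ways: by switching the current database first (USE) or by qualifying the table name. A sketch of both paths, with illustrative database and table names; the "Failed to get database ..." warnings interleaved above appear to be benign existence checks the metastore logs before each database is created:

    val df = spark.range(3).toDF("id")

    // With USE: make the database current, then use an unqualified name.
    spark.sql("CREATE DATABASE mydb")
    spark.sql("USE mydb")
    df.write.mode("overwrite").saveAsTable("t")

    // Without USE: qualify the table name with the database.
    df.write.mode("append").saveAsTable("mydb.t")
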
[info] ConcurrentHiveSuite:
[info] - multiple instances not supported !!! IGNORED !!!
[info] HiveClientSuite:
19:02:46.642 WARN org.apache.hadoop.hive.metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
19:02:49.372 WARN org.apache.spark.sql.hive.client.Shim_v1_2: Caught Hive MetaException attempting to get partition metadata by filter from Hive. Falling back to fetching all partition metadata, which will degrade performance. Modifying your Hive metastore configuration to set hive.metastore.try.direct.sql to true may resolve this problem.
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.sql.hive.client.Shim_v0_13.getPartitionsByFilter(HiveShim.scala:614)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getPartitionsByFilter$1.apply(HiveClientImpl.scala:574)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$getPartitionsByFilter$1.apply(HiveClientImpl.scala:572)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:279)
	at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:226)
	at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:225)
	at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:268)
	at org.apache.spark.sql.hive.client.HiveClientImpl.getPartitionsByFilter(HiveClientImpl.scala:572)
	at org.apache.spark.sql.hive.client.HiveClientSuite$$anonfun$1.apply$mcV$sp(HiveClientSuite.scala:56)
	at org.apache.spark.sql.hive.client.HiveClientSuite$$anonfun$1.apply(HiveClientSuite.scala:34)
	at org.apache.spark.sql.hive.client.HiveClientSuite$$anonfun$1.apply(HiveClientSuite.scala:34)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:68)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:31)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:31)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:357)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:502)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: MetaException(message:Filtering is supported only on partition keys of type string)
	at org.apache.hadoop.hive.metastore.parser.ExpressionTree$FilterBuilder.setError(ExpressionTree.java:185)
	at org.apache.hadoop.hive.metastore.parser.ExpressionTree$LeafNode.getJdoFilterPushdownParam(ExpressionTree.java:440)
	at org.apache.hadoop.hive.metastore.parser.ExpressionTree$LeafNode.generateJDOFilterOverPartitions(ExpressionTree.java:357)
	at org.apache.hadoop.hive.metastore.parser.ExpressionTree$LeafNode.generateJDOFilter(ExpressionTree.java:279)
	at org.apache.hadoop.hive.metastore.parser.ExpressionTree.generateJDOFilterFragment(ExpressionTree.java:578)
	at org.apache.hadoop.hive.metastore.ObjectStore.makeQueryFilterString(ObjectStore.java:2615)
	at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsViaOrmFilter(ObjectStore.java:2199)
	at org.apache.hadoop.hive.metastore.ObjectStore.access$500(ObjectStore.java:160)
	at org.apache.hadoop.hive.metastore.ObjectStore$5.getJdoResult(ObjectStore.java:2530)
	at org.apache.hadoop.hive.metastore.ObjectStore$5.getJdoResult(ObjectStore.java:2515)
	at org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.run(ObjectStore.java:2391)
	at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsByFilterInternal(ObjectStore.java:2515)
	at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsByFilter(ObjectStore.java:2335)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
	at com.sun.proxy.$Proxy94.getPartitionsByFilter(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_partitions_by_filter(HiveMetaStore.java:4442)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
	at com.sun.proxy.$Proxy96.get_partitions_by_filter(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.listPartitionsByFilter(HiveMetaStoreClient.java:1103)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
	at com.sun.proxy.$Proxy97.listPartitionsByFilter(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.getPartitionsByFilter(Hive.java:2254)
	... 56 more
[info] - getPartitionsByFilter returns all partitions when hive.metastore.try.direct.sql=false (7 seconds, 859 milliseconds)
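
The warning and MetaException above document the degraded path this test exercises: with hive.metastore.try.direct.sql=false, the metastore can only evaluate JDO filters over string partition keys, so the pushdown fails and Spark falls back to fetching all partition metadata. A hedged sketch of the pattern (names and signatures are illustrative, not the actual Shim API):

    import org.apache.hadoop.hive.metastore.api.MetaException

    // Illustrative sketch only: try server-side filtering first; on
    // MetaException, fetch all partitions and filter on the client.
    def partitionsByFilter(filter: String): Seq[Partition] =
      try pushFilterToMetastore(filter)       // fast path: pushdown
      catch {
        case _: MetaException =>
          // Fallback hit in this test: degrades performance, but the
          // test asserts that it still returns every partition.
          getAllPartitions().filter(p => matchesFilter(p, filter))
      }

As the warning notes, setting hive.metastore.try.direct.sql=true in the metastore configuration lets Hive evaluate such filters via direct SQL and avoids the fallback.
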
[info] StatisticsSuite:
19:02:49.834 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/hive_tbl specified for non-external table:hive_tbl
[info] - SPARK-18856: non-empty partitioned table should not report zero size (869 milliseconds)
19:02:50.379 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:02:50.389 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:02:50.392 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
[info] - MetastoreRelations fallback to HDFS for size estimation (57 milliseconds)
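
The repeated HiveExternalCatalog warning means the case-preserving schema Spark recorded when it created the table (int columns) no longer matches what the metastore reports (string columns), so Spark must trust the metastore. A hedged sketch of that reconciliation; the helper names here are invented:

    // Prefer the schema Spark stored in the table properties at creation
    // time; fall back to the Hive metastore schema, losing case
    // preservation, when the two diverge.
    val sparkSchema = schemaFromTableProperties(table)  // e.g. page_id: int
    val hiveSchema  = metastoreSchema(table)            // e.g. page_id: string
    val schema =
      if (isCompatible(sparkSchema, hiveSchema)) sparkSchema
      else hiveSchema   // the fallback the warning above reports
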
19:02:50.410 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/analyzetable specified for non-external table:analyzetable
19:02:50.462 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/src specified for non-external table:src
19:02:51.028 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/analyzetable_part specified for non-external table:analyzetable_part
[info] - analyze MetastoreRelations (1 second, 556 milliseconds)
[info] - analyzing views is not supported (286 milliseconds)
19:02:52.254 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/texttable specified for non-external table:texttable
[info] - test table-level statistics for hive tables created in HiveExternalCatalog (448 milliseconds)
19:02:52.703 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/texttable specified for non-external table:texttable
[info] - test elimination of the influences of the old stats (576 milliseconds)
19:02:53.280 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/parquettable specified for non-external table:parquettable
19:02:53.320 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/orctable specified for non-external table:orctable
[info] - test statistics of LogicalRelation converted from MetastoreRelation (620 milliseconds)
[info] - verify serialized column stats after analyzing columns (1 second, 218 milliseconds)
[info] - test table-level statistics for data source table created in HiveExternalCatalog (529 milliseconds)
[info] - test table-level statistics for partitioned data source table (869 milliseconds)
19:02:56.586 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider json. Persisting data source table `default`.`table_no_cols` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] - statistics collection of a table with zero column (230 milliseconds)
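
The SerDe warning above is expected: the json data source has no Hive SerDe mapping, so Spark persists the table metadata in its own Spark-specific format that Hive itself cannot read. A sketch of how the zero-column table behind this test might be built (table name mirrors the log):

    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types.StructType

    // A zero-column, one-row DataFrame saved through a provider without a
    // Hive SerDe (json) triggers the "NOT compatible with Hive" warning.
    val rows = spark.sparkContext.parallelize(Seq(Row()))
    val noCols = spark.createDataFrame(rows, StructType(Nil))
    noCols.write.format("json").saveAsTable("table_no_cols")
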
[info] - test refreshing table stats of cached data source table by `ANALYZE TABLE` statement (363 milliseconds)
[info] - estimates the size of a test MetastoreRelation (17 milliseconds)
[info] - auto converts to broadcast hash join, by size estimate of a relation (283 milliseconds)
[info] - auto converts to broadcast left semi join, by size estimate of a relation (245 milliseconds)
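
The statistics tests above follow a common flow: collect table-level stats, then let the planner compare the relation's estimated size against spark.sql.autoBroadcastJoinThreshold when choosing a broadcast hash join. A sketch using table names from the log; the join column is illustrative:

    // Refresh table-level statistics in the metastore.
    spark.sql("ANALYZE TABLE texttable COMPUTE STATISTICS")

    // If texttable's estimated size is now under the broadcast threshold,
    // the join below should plan as a broadcast hash join.
    spark.table("texttable").join(spark.table("src"), "key").explain()
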
[info] HiveDataFrameJoinSuite:
[info] - join - self join auto resolve ambiguity with case insensitivity (233 milliseconds)
[info] CommitFailureTestRelationSuite:
19:02:58.088 ERROR org.apache.spark.util.Utils: Aborting task
java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:191)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
19:02:58.093 ERROR org.apache.spark.sql.execution.datasources.FileFormatWriter: Job job_20180827190258_14653 aborted.
19:02:58.095 WARN org.apache.spark.util.Utils: Suppressing exception in catch: Intentional task commitment failure for testing purpose.
java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
19:02:58.096 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14653.0 (TID 35356)
org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:191)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
19:02:58.099 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 14653.0 (TID 35356, localhost, executor driver): org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:191)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more

19:02:58.099 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 14653.0 failed 1 times; aborting job
19:02:58.101 ERROR org.apache.spark.sql.execution.datasources.FileFormatWriter: Aborting job null.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 14653.0 failed 1 times, most recent failure: Lost task 0.0 in stage 14653.0 (TID 35356, localhost, executor driver): org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:191)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1455)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1443)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1442)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1670)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1625)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1614)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1928)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1941)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1961)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply$mcV$sp(FileFormatWriter.scala:127)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:101)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
	at org.apache.spark.sql.execution.datasources.DataSource.writeInFileFormat(DataSource.scala:484)
	at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:520)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:198)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply$mcV$sp(CommitFailureTestRelationSuite.scala:39)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply(CommitFailureTestRelationSuite.scala:39)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$1.apply(CommitFailureTestRelationSuite.scala:39)
	at org.scalatest.Assertions$class.intercept(Assertions.scala:997)
	at org.scalatest.FunSuite.intercept(FunSuite.scala:1555)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(CommitFailureTestRelationSuite.scala:38)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(CommitFailureTestRelationSuite.scala:33)
	at org.apache.spark.sql.test.SQLTestUtils$class.withTempPath(SQLTestUtils.scala:124)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite.withTempPath(CommitFailureTestRelationSuite.scala:28)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1.apply$mcV$sp(CommitFailureTestRelationSuite.scala:33)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1.apply(CommitFailureTestRelationSuite.scala:33)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$1.apply(CommitFailureTestRelationSuite.scala:33)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:68)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:31)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:31)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:357)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:502)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	... 3 more
Caused by: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:191)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
[info] - SPARK-7684: commitTask() failure should fallback to abortTask() (96 milliseconds)
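
SPARK-7684 covers the scenario in the error spew above: an OutputWriter whose close() throws makes commitTask() fail, and FileFormatWriter must fall back to abortTask() and remove the partial output. A hedged sketch of the test pattern; commitFailureTestFormat stands in for the suite's test-only data source, and withTempPath is the SQLTestUtils helper seen in the stack trace:

    withTempPath { dir =>                         // SQLTestUtils helper
      intercept[SparkException] {
        spark.range(3).toDF("id")
          .write.format(commitFailureTestFormat)  // writer throws in close()
          .save(dir.getAbsolutePath)
      }
      assert(!dir.exists)                         // abortTask() cleaned up
    }

The intentional RuntimeException is wrapped into the SparkException above, and the test passing confirms the staging directory was removed despite the failed commit.
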
19:02:58.206 ERROR org.apache.spark.util.Utils: Aborting task
org.apache.spark.SparkException: Failed to execute user defined function($anonfun$3: (int) => int)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:50)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.execute(FileFormatWriter.scala:243)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ArithmeticException: / by zero
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply$mcII$sp(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:48)
	... 15 more
19:02:58.207 ERROR org.apache.spark.sql.execution.datasources.FileFormatWriter: Job job_20180827190258_14654 aborted.
19:02:58.207 WARN org.apache.spark.util.Utils: Suppressing exception in catch: Intentional task commitment failure for testing purpose.
java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
19:02:58.207 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14654.0 (TID 35357)
org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Failed to execute user defined function($anonfun$3: (int) => int)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:50)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.execute(FileFormatWriter.scala:243)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
Caused by: java.lang.ArithmeticException: / by zero
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply$mcII$sp(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:48)
	... 15 more
19:02:58.210 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 14654.0 (TID 35357, localhost, executor driver): org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Failed to execute user defined function($anonfun$3: (int) => int)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:50)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.execute(FileFormatWriter.scala:243)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
Caused by: java.lang.ArithmeticException: / by zero
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply$mcII$sp(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:48)
	... 15 more

19:02:58.210 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 14654.0 failed 1 times; aborting job
19:02:58.212 ERROR org.apache.spark.sql.execution.datasources.FileFormatWriter: Aborting job null.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 14654.0 failed 1 times, most recent failure: Lost task 0.0 in stage 14654.0 (TID 35357, localhost, executor driver): org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Failed to execute user defined function($anonfun$3: (int) => int)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:50)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.execute(FileFormatWriter.scala:243)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
Caused by: java.lang.ArithmeticException: / by zero
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply$mcII$sp(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:48)
	... 15 more

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1455)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1443)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1442)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1670)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1625)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1614)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1928)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1941)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1961)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply$mcV$sp(FileFormatWriter.scala:127)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:101)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
	at org.apache.spark.sql.execution.datasources.DataSource.writeInFileFormat(DataSource.scala:484)
	at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:520)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:198)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$apply$2.apply$mcV$sp(CommitFailureTestRelationSuite.scala:56)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$apply$2.apply(CommitFailureTestRelationSuite.scala:56)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$apply$2.apply(CommitFailureTestRelationSuite.scala:56)
	at org.scalatest.Assertions$class.intercept(Assertions.scala:997)
	at org.scalatest.FunSuite.intercept(FunSuite.scala:1555)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2.apply(CommitFailureTestRelationSuite.scala:55)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2.apply(CommitFailureTestRelationSuite.scala:49)
	at org.apache.spark.sql.test.SQLTestUtils$class.withTempPath(SQLTestUtils.scala:124)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite.withTempPath(CommitFailureTestRelationSuite.scala:28)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2.apply$mcV$sp(CommitFailureTestRelationSuite.scala:49)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2.apply(CommitFailureTestRelationSuite.scala:47)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2.apply(CommitFailureTestRelationSuite.scala:47)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:68)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:31)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:31)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:357)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:502)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	... 3 more
Caused by: org.apache.spark.SparkException: Failed to execute user defined function($anonfun$3: (int) => int)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:50)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.execute(FileFormatWriter.scala:243)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.releaseResources(FileFormatWriter.scala:252)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
Caused by: java.lang.ArithmeticException: / by zero
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply$mcII$sp(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$2$$anonfun$apply$mcV$sp$2$$anonfun$3.apply(CommitFailureTestRelationSuite.scala:51)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:48)
	... 15 more
[info] - call failure callbacks before close writer - default (110 milliseconds)
19:02:58.347 ERROR org.apache.spark.util.Utils: Aborting task
java.lang.RuntimeException: Intentional task writer failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.write(CommitFailureTestSource.scala:54)
	at org.apache.spark.sql.execution.datasources.OutputWriter.writeInternal(OutputWriter.scala:93)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.execute(FileFormatWriter.scala:397)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
19:02:58.348 ERROR org.apache.spark.sql.execution.datasources.FileFormatWriter: Job job_20180827190258_14655 aborted.
19:02:58.348 WARN org.apache.spark.util.Utils: Suppressing exception in catch: Intentional task commitment failure for testing purpose.
java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.releaseResources(FileFormatWriter.scala:408)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
19:02:58.357 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 14655.0 (TID 35358)
org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Intentional task writer failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.write(CommitFailureTestSource.scala:54)
	at org.apache.spark.sql.execution.datasources.OutputWriter.writeInternal(OutputWriter.scala:93)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.execute(FileFormatWriter.scala:397)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.releaseResources(FileFormatWriter.scala:408)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
19:02:58.358 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 14655.0 (TID 35358, localhost, executor driver): org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Intentional task writer failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.write(CommitFailureTestSource.scala:54)
	at org.apache.spark.sql.execution.datasources.OutputWriter.writeInternal(OutputWriter.scala:93)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.execute(FileFormatWriter.scala:397)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.releaseResources(FileFormatWriter.scala:408)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more

19:02:58.358 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 14655.0 failed 1 times; aborting job
19:02:58.360 ERROR org.apache.spark.sql.execution.datasources.FileFormatWriter: Aborting job null.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 14655.0 failed 1 times, most recent failure: Lost task 0.0 in stage 14655.0 (TID 35358, localhost, executor driver): org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Intentional task writer failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.write(CommitFailureTestSource.scala:54)
	at org.apache.spark.sql.execution.datasources.OutputWriter.writeInternal(OutputWriter.scala:93)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.execute(FileFormatWriter.scala:397)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.releaseResources(FileFormatWriter.scala:408)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1455)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1443)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1442)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1670)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1625)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1614)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1928)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1941)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1961)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply$mcV$sp(FileFormatWriter.scala:127)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:121)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:101)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
	at org.apache.spark.sql.execution.datasources.DataSource.writeInFileFormat(DataSource.scala:484)
	at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:520)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:198)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4$$anonfun$apply$mcV$sp$3$$anonfun$apply$3.apply$mcV$sp(CommitFailureTestRelationSuite.scala:74)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4$$anonfun$apply$mcV$sp$3$$anonfun$apply$3.apply(CommitFailureTestRelationSuite.scala:74)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4$$anonfun$apply$mcV$sp$3$$anonfun$apply$3.apply(CommitFailureTestRelationSuite.scala:74)
	at org.scalatest.Assertions$class.intercept(Assertions.scala:997)
	at org.scalatest.FunSuite.intercept(FunSuite.scala:1555)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4$$anonfun$apply$mcV$sp$3.apply(CommitFailureTestRelationSuite.scala:73)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4$$anonfun$apply$mcV$sp$3.apply(CommitFailureTestRelationSuite.scala:67)
	at org.apache.spark.sql.test.SQLTestUtils$class.withTempPath(SQLTestUtils.scala:124)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite.withTempPath(CommitFailureTestRelationSuite.scala:28)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4.apply$mcV$sp(CommitFailureTestRelationSuite.scala:67)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4.apply(CommitFailureTestRelationSuite.scala:65)
	at org.apache.spark.sql.sources.CommitFailureTestRelationSuite$$anonfun$4.apply(CommitFailureTestRelationSuite.scala:65)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:68)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:31)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:31)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:357)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:502)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Task failed while writing rows
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:204)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:129)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1$$anonfun$3.apply(FileFormatWriter.scala:128)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
	... 3 more
Caused by: java.lang.RuntimeException: Intentional task writer failure for testing purpose.
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.write(CommitFailureTestSource.scala:54)
	at org.apache.spark.sql.execution.datasources.OutputWriter.writeInternal(OutputWriter.scala:93)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.execute(FileFormatWriter.scala:397)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:190)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:188)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1356)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:193)
	... 8 more
	Suppressed: java.lang.RuntimeException: Intentional task commitment failure for testing purpose.
		at scala.sys.package$.error(package.scala:27)
		at org.apache.spark.sql.sources.CommitFailureTestSource$$anon$1$$anon$2.close(CommitFailureTestSource.scala:62)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$DynamicPartitionWriteTask.releaseResources(FileFormatWriter.scala:408)
		at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$1.apply$mcV$sp(FileFormatWriter.scala:196)
		at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1365)
		... 9 more
[info] - call failure callbacks before close writer - partitioned (148 milliseconds)
[info] HiveUDFSuite:
19:02:58.471 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/src specified for non-external table:src
[info] - spark sql udf test that returns a struct (243 milliseconds)
[info] - SPARK-4785 When called with arguments referring column fields, PMOD throws NPE (125 milliseconds)
19:02:58.790 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/hiveudftesttable specified for non-external table:hiveudftesttable
[info] - hive struct udf (106 milliseconds)
[info] - Max/Min on named_struct (649 milliseconds)
[info] - SPARK-6409 UDAF Average test (303 milliseconds)
19:02:59.849 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/src specified for non-external table:src
[info] - SPARK-2693 udaf aggregates test (866 milliseconds)
[info] - SPARK-16228 Percentile needs explicit cast to double (13 milliseconds)
[info] - Generic UDAF aggregates (3 seconds, 59 milliseconds)
[info] - UDFIntegerToString (198 milliseconds)
[info] - UDFToListString (50 milliseconds)
[info] - UDFToListInt (69 milliseconds)
[info] - UDFToStringIntMap (105 milliseconds)
[info] - UDFToIntIntMap (60 milliseconds)
[info] - UDFListListInt (137 milliseconds)
[info] - UDFListString (113 milliseconds)
[info] - UDFStringString (178 milliseconds)
[info] - UDFTwoListList (177 milliseconds)
[info] - non-deterministic children of UDF (19 milliseconds)
[info] - non-deterministic children expressions of UDAF (24 milliseconds)
[info] - Hive UDFs with insufficient number of input arguments should trigger an analysis error (55 milliseconds)
[info] - Hive UDF in group by (262 milliseconds)
19:03:05.247 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:03:05.341 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:03:05.622 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:03:05.625 WARN org.apache.spark.sql.hive.HiveExternalCatalog: The table schema given by Hive metastore(struct<page_id:string,impressions:string>) is different from the schema when this table was created by Spark SQL(struct<page_id:int,impressions:int>). We have to fall back to the table schema from Hive metastore which is not case preserving.
19:03:06.212 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/parquet_tmp specified for non-external table:parquet_tmp
[info] - SPARK-11522 select input_file_name from non-parquet table (1 second, 326 milliseconds)
[info] - Hive Stateful UDF (463 milliseconds)
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.testUDAF started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.saveTableAndQueryIt started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 1.346s
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveExternalTableAndQueryIt started
19:03:08.513 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/datasource-94764989-0ce1-43b1-99ac-197b4e0bed3f was not found. Was it deleted very recently?
19:03:08.583 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
19:03:08.739 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`externaltable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveTableAndQueryIt started
19:03:09.020 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/warehouse-d2c6f265-1677-44e6-a335-120d64b4221b/javasavedtable was not found. Was it deleted very recently?
19:03:09.111 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveExternalTableWithSchemaAndQueryIt started
19:03:09.224 WARN org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex: The directory file:/home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/target/tmp/datasource-c6ab0b6a-3ad8-464a-ba7d-c4df6d50d7ee was not found. Was it deleted very recently?
19:03:09.293 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
19:03:09.419 WARN org.apache.spark.sql.hive.HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`externaltable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test run finished: 0 failed, 0 ignored, 3 total, 1.215s
[info] ScalaCheck
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] Warning: Unknown ScalaCheck args provided: -oDF
[info] ScalaTest
[info] Run completed in 1 hour, 54 minutes, 2 seconds.
[info] Total number of tests run: 2311
[info] Suites: completed 72, aborted 0
[info] Tests: succeeded 2311, failed 0, canceled 0, ignored 593, pending 0
[info] All tests passed.
[info] Passed: Total 2316, Failed 0, Errors 0, Passed 2316, Ignored 593
[error] (streaming/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 7506 s, completed Aug 27, 2018 7:03:20 PM
[error] running /home/jenkins/workspace/spark-branch-2.1-test-sbt-hadoop-2.2/build/sbt -Phadoop-2.2 -Phive -Pyarn -Pmesos -Phive-thriftserver -Pkinesis-asl test ; received return code 1
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Finished: FAILURE