Test Result : DataFrameReaderWriterSuite

0 failures (±0)
47 tests (±0)
Took 50 sec.

All Tests

Test name | Duration | Status
Create table as select command should output correct schema: basic | 1 sec | Passed
Create table as select command should output correct schema: complex | 1.2 sec | Passed
Insert overwrite table command should output correct schema: basic | 1.2 sec | Passed
Insert overwrite table command should output correct schema: complex | 1.8 sec | Passed
SPARK-16848: table API throws an exception for user specified schema | 16 ms | Passed
SPARK-17230: write out results of decimal calculation | 0.97 sec | Passed
SPARK-18510: use user specified types for partition columns in file sources | 1.2 sec | Passed
SPARK-18899: append to a bucketed table using DataFrameWriter with mismatched bucketing | 1.9 sec | Passed
SPARK-18912: number of columns mismatch for non-file-based data source table | 76 ms | Passed
SPARK-18913: append to a table with special column names | 1.1 sec | Passed
SPARK-20431: Specify a schema by using a DDL-formatted string | 1 sec | Passed
SPARK-20460 Check name duplication in buckets | 95 ms | Passed
SPARK-20460 Check name duplication in schema | 7.5 sec | Passed
SPARK-29537: throw exception when user defined a wrong base path | 0.5 sec | Passed
Throw exception on unsafe cast with ANSI casting policy | 0.21 sec | Passed
Throw exception on unsafe table insertion with strict casting policy | 1.3 sec | Passed
check jdbc() does not support partitioning, bucketBy or sortBy | 26 ms | Passed
column nullability and comment - write and then read | 3.3 sec | Passed
csv - API and behavior regarding schema | 4 sec | Passed
json - API and behavior regarding schema | 2.7 sec | Passed
load API | 71 ms | Passed
options | 26 ms | Passed
orc - API and behavior regarding schema | 3.1 sec | Passed
parquet - API and behavior regarding schema | 3.4 sec | Passed
parquet - column nullability -- write only | 0.54 sec | Passed
pass partitionBy as options | 0.15 sec | Passed
prevent all column partitioning | 80 ms | Passed
read a data source that does not extend RelationProvider | 0.16 sec | Passed
read a data source that does not extend SchemaRelationProvider | 0.33 sec | Passed
resolve default source | 0.35 sec | Passed
resolve default source without extending SchemaRelationProvider | 0.3 sec | Passed
resolve full class | 29 ms | Passed
save mode | 0.1 sec | Passed
save mode for data source v2 | 0.27 sec | Passed
saveAsTable with mode Append should not fail if the table already exists and a same-name temp view exist | 1.4 sec | Passed
saveAsTable with mode Append should not fail if the table not exists but a same-name temp view exist | 0.54 sec | Passed
saveAsTable with mode ErrorIfExists should not fail if the table not exists but a same-name temp view exist | 0.32 sec | Passed
saveAsTable with mode Ignore should create the table if the table not exists but a same-name temp view exist | 0.23 sec | Passed
saveAsTable with mode Overwrite should not drop the temp view if the table not exists but a same-name temp view exist | 0.37 sec | Passed
saveAsTable with mode Overwrite should not fail if the table already exists and a same-name temp view exist | 0.98 sec | Passed
test different data types for options | 33 ms | Passed
test path option in load | 46 ms | Passed
text - API and behavior regarding schema | 1.8 sec | Passed
textFile - API and behavior regarding schema | 1.3 sec | Passed
use Spark jobs to list files | 1.3 sec | Passed
write path implements onTaskCommit API correctly | 0.5 sec | Passed
writeStream cannot be called on non-streaming datasets | 0.62 sec | Passed
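Most of the tests above exercise the public DataFrameReader / DataFrameWriter API. As a minimal sketch (not code from the suite itself), the example below shows the two behaviors named in "SPARK-20431: Specify a schema by using a DDL-formatted string" and "save mode": writing with an explicit SaveMode and reading CSV back with a DDL-formatted schema string. The object name and output path are hypothetical.

import org.apache.spark.sql.{SaveMode, SparkSession}

// Hypothetical standalone example, not part of DataFrameReaderWriterSuite.
object ReaderWriterSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("reader-writer-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val path = "/tmp/reader-writer-sketch"  // hypothetical output location

    // Write a small DataFrame with an explicit save mode (cf. the "save mode" tests).
    Seq((1, "a"), (2, "b")).toDF("id", "name")
      .write
      .mode(SaveMode.Overwrite)
      .csv(path)

    // Read it back, supplying the schema as a DDL-formatted string (cf. SPARK-20431).
    val df = spark.read
      .schema("id INT, name STRING")
      .csv(path)
    df.show()

    spark.stop()
  }
}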