Success

Console Output

Skipping 19,178 KB…
copying deps/examples/ml/lda_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/robust_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/jars/JLargeArrays-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/JTransforms-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/algebra_2.12-2.0.0-M2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-format-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/audience-annotations-0.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/azure-storage-2.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze_2.12-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cats-kernel_2.12-2.0.0-M4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-text-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-aws-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-azure-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-api-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/istack-commons-runtime-3.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.annotation-api-1.3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.inject-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.ws.rs-api-2.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.xml.bind-api-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/janino-3.0.15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javassist-3.22.0-CR2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.el-3.0.1-b11.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-runtime-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-client-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-common-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-hk2-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-server-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-sslengine-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/lz4-java-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.42.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okapi-shade-0.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.6-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.6-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/py4j-0.10.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pyrolite-4.30.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-collection-compat_2.12-2.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.25.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT-tests.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-cypher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph-api_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-platform_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-util_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spotbugs-annotations-3.1.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/validation-api-2.0.1.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.14.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.14.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.3-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying lib/py4j-0.10.8.1-src.zip -> pyspark-3.0.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.0.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resourceinformation.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tree.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/cogroup.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.0.0.dev0/pyspark/streaming
Writing pyspark-3.0.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-3.0.0.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-3.0.0.dev0.tar.gz
Collecting py4j==0.10.8.1 (from pyspark==3.0.0.dev0)
  Downloading https://files.pythonhosted.org/packages/04/de/2d314a921ef4c20b283e1de94e0780273678caac901564df06b948e4ba9b/py4j-0.10.8.1-py2.py3-none-any.whl (196kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py install for pyspark: started
    Running setup.py install for pyspark: finished with status 'done'
Successfully installed py4j-0.10.8.1 pyspark-3.0.0.dev0
You are using pip version 10.0.1, however version 19.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Run basic sanity check on pip installed version with spark-submit
19/10/12 10:29:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/10/12 10:29:59 INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT
19/10/12 10:29:59 INFO ResourceUtils: ==============================================================
19/10/12 10:29:59 INFO ResourceUtils: Resources for spark.driver:

19/10/12 10:29:59 INFO ResourceUtils: ==============================================================
19/10/12 10:29:59 INFO SparkContext: Submitted application: PipSanityCheck
19/10/12 10:30:00 INFO SecurityManager: Changing view acls to: jenkins
19/10/12 10:30:00 INFO SecurityManager: Changing modify acls to: jenkins
19/10/12 10:30:00 INFO SecurityManager: Changing view acls groups to: 
19/10/12 10:30:00 INFO SecurityManager: Changing modify acls groups to: 
19/10/12 10:30:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/10/12 10:30:00 INFO Utils: Successfully started service 'sparkDriver' on port 46474.
19/10/12 10:30:00 INFO SparkEnv: Registering MapOutputTracker
19/10/12 10:30:00 INFO SparkEnv: Registering BlockManagerMaster
19/10/12 10:30:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/10/12 10:30:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/10/12 10:30:00 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-3a858f29-0630-44e6-9d6d-34684d3aec27
19/10/12 10:30:00 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
19/10/12 10:30:00 INFO SparkEnv: Registering OutputCommitCoordinator
19/10/12 10:30:00 INFO log: Logging initialized @2694ms to org.eclipse.jetty.util.log.Slf4jLog
19/10/12 10:30:00 INFO Server: jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_191-b12
19/10/12 10:30:00 INFO Server: Started @2795ms
19/10/12 10:30:00 INFO AbstractConnector: Started ServerConnector@dba1d01{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/10/12 10:30:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5b649842{/jobs,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@9713cd1{/jobs/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@d396a71{/jobs/job,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@37dc61e0{/jobs/job/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@70b7d0cb{/stages,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1c96e530{/stages/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2493619b{/stages/stage,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7ecd1bac{/stages/stage/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@73e0ece9{/stages/pool,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4d8eef3d{/stages/pool/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@52d04e62{/storage,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@77f61e69{/storage/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1ad4b9a5{/storage/rdd,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4f003b6b{/storage/rdd/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1fd130{/environment,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@55196b9f{/environment/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5e4a3afc{/executors,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3d6788c8{/executors/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@52c96ad{/executors/threadDump,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@45a44cf9{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5818bcce{/static,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@56d002b9{/,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6f4c73d0{/api,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5f950ccf{/jobs/job/kill,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@8199656{/stages/stage/kill,null,AVAILABLE,@Spark}
19/10/12 10:30:00 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://amp-jenkins-worker-03.amp:4040
19/10/12 10:30:00 INFO Executor: Starting executor ID driver on host amp-jenkins-worker-03.amp
19/10/12 10:30:00 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33585.
19/10/12 10:30:00 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-03.amp:33585
19/10/12 10:30:00 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/10/12 10:30:00 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-03.amp, 33585, None)
19/10/12 10:30:00 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-03.amp:33585 with 366.3 MiB RAM, BlockManagerId(driver, amp-jenkins-worker-03.amp, 33585, None)
19/10/12 10:30:00 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-03.amp, 33585, None)
19/10/12 10:30:00 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-03.amp, 33585, None)
19/10/12 10:30:00 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@118788ad{/metrics/json,null,AVAILABLE,@Spark}
19/10/12 10:30:01 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
19/10/12 10:30:01 INFO SharedState: Warehouse path is 'file:/spark-warehouse'.
19/10/12 10:30:01 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@934a8ac{/SQL,null,AVAILABLE,@Spark}
19/10/12 10:30:01 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5ce1d129{/SQL/json,null,AVAILABLE,@Spark}
19/10/12 10:30:01 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@38f3a247{/SQL/execution,null,AVAILABLE,@Spark}
19/10/12 10:30:01 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2877a66b{/SQL/execution/json,null,AVAILABLE,@Spark}
19/10/12 10:30:01 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7291e35c{/static/sql,null,AVAILABLE,@Spark}
19/10/12 10:30:01 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/10/12 10:30:02 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31
19/10/12 10:30:02 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31) with 10 output partitions
19/10/12 10:30:02 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31)
19/10/12 10:30:02 INFO DAGScheduler: Parents of final stage: List()
19/10/12 10:30:02 INFO DAGScheduler: Missing parents: List()
19/10/12 10:30:02 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31), which has no missing parents
19/10/12 10:30:02 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 6.3 KiB, free 366.3 MiB)
19/10/12 10:30:02 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.0 KiB, free 366.3 MiB)
19/10/12 10:30:02 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on amp-jenkins-worker-03.amp:33585 (size: 4.0 KiB, free: 366.3 MiB)
19/10/12 10:30:02 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1196
19/10/12 10:30:02 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
19/10/12 10:30:02 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
19/10/12 10:30:02 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, amp-jenkins-worker-03.amp, executor driver, partition 0, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, amp-jenkins-worker-03.amp, executor driver, partition 1, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, amp-jenkins-worker-03.amp, executor driver, partition 2, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, amp-jenkins-worker-03.amp, executor driver, partition 3, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, amp-jenkins-worker-03.amp, executor driver, partition 4, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, amp-jenkins-worker-03.amp, executor driver, partition 5, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, amp-jenkins-worker-03.amp, executor driver, partition 6, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, amp-jenkins-worker-03.amp, executor driver, partition 7, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, amp-jenkins-worker-03.amp, executor driver, partition 8, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, amp-jenkins-worker-03.amp, executor driver, partition 9, PROCESS_LOCAL, 7333 bytes)
19/10/12 10:30:02 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
19/10/12 10:30:02 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
19/10/12 10:30:02 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/10/12 10:30:02 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
19/10/12 10:30:02 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
19/10/12 10:30:02 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
19/10/12 10:30:02 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
19/10/12 10:30:02 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
19/10/12 10:30:02 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
19/10/12 10:30:02 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
19/10/12 10:30:03 INFO PythonRunner: Times: total = 427, boot = 382, init = 45, finish = 0
19/10/12 10:30:03 INFO PythonRunner: Times: total = 427, boot = 384, init = 43, finish = 0
19/10/12 10:30:03 INFO PythonRunner: Times: total = 428, boot = 373, init = 54, finish = 1
19/10/12 10:30:03 INFO PythonRunner: Times: total = 428, boot = 379, init = 48, finish = 1
19/10/12 10:30:03 INFO PythonRunner: Times: total = 432, boot = 388, init = 43, finish = 1
19/10/12 10:30:03 INFO PythonRunner: Times: total = 434, boot = 392, init = 42, finish = 0
19/10/12 10:30:03 INFO PythonRunner: Times: total = 439, boot = 396, init = 43, finish = 0
19/10/12 10:30:03 INFO PythonRunner: Times: total = 442, boot = 400, init = 42, finish = 0
19/10/12 10:30:03 INFO PythonRunner: Times: total = 448, boot = 405, init = 43, finish = 0
19/10/12 10:30:03 INFO PythonRunner: Times: total = 451, boot = 409, init = 42, finish = 0
19/10/12 10:30:03 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1549 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1549 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1549 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1550 bytes result sent to driver
19/10/12 10:30:03 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 996 ms on amp-jenkins-worker-03.amp (executor driver) (1/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 1006 ms on amp-jenkins-worker-03.amp (executor driver) (2/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1041 ms on amp-jenkins-worker-03.amp (executor driver) (3/10)
19/10/12 10:30:03 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 37644
19/10/12 10:30:03 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 1008 ms on amp-jenkins-worker-03.amp (executor driver) (4/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 1006 ms on amp-jenkins-worker-03.amp (executor driver) (5/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1014 ms on amp-jenkins-worker-03.amp (executor driver) (6/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 1014 ms on amp-jenkins-worker-03.amp (executor driver) (7/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 1012 ms on amp-jenkins-worker-03.amp (executor driver) (8/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 1013 ms on amp-jenkins-worker-03.amp (executor driver) (9/10)
19/10/12 10:30:03 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 1012 ms on amp-jenkins-worker-03.amp (executor driver) (10/10)
19/10/12 10:30:03 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/10/12 10:30:03 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31) finished in 1.213 s
19/10/12 10:30:03 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
19/10/12 10:30:03 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
19/10/12 10:30:03 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/NewSparkPullRequestBuilder/dev/pip-sanity-check.py:31, took 1.270871 s
Successfully ran pip sanity check
19/10/12 10:30:03 INFO AbstractConnector: Stopped Spark@dba1d01{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/10/12 10:30:03 INFO SparkUI: Stopped Spark web UI at http://amp-jenkins-worker-03.amp:4040
19/10/12 10:30:03 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/10/12 10:30:03 INFO MemoryStore: MemoryStore cleared
19/10/12 10:30:03 INFO BlockManager: BlockManager stopped
19/10/12 10:30:03 INFO BlockManagerMaster: BlockManagerMaster stopped
19/10/12 10:30:03 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/10/12 10:30:03 INFO SparkContext: Successfully stopped SparkContext
19/10/12 10:30:04 INFO ShutdownHookManager: Shutdown hook called
19/10/12 10:30:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-da357504-d1d8-4fe0-b6a5-ae3a47d3b41e/pyspark-29cbf9a6-791a-4620-9a01-6f2248c2e248
19/10/12 10:30:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-da357504-d1d8-4fe0-b6a5-ae3a47d3b41e
19/10/12 10:30:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-fc45c8a1-6fd8-4cbc-a96d-ab80022734fa
Run basic sanity check with import based
19/10/12 10:30:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
19/10/12 10:30:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/10/12 10:30:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]
[Stage 0:==============>                                            (1 + 3) / 4]

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
19/10/12 10:30:31 WARN PythonRunner: Incomplete task 3.0 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
19/10/12 10:30:31 WARN PythonRunner: Incomplete task 1.0 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
19/10/12 10:30:31 WARN PythonRunner: Incomplete task 2.0 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
19/10/12 10:30:31 WARN PythonRunner: Incomplete task 0.0 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
19/10/12 10:30:31 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42, amp-jenkins-worker-03.amp, executor driver): TaskKilled (Stage cancelled)
19/10/12 10:30:31 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40, amp-jenkins-worker-03.amp, executor driver): TaskKilled (Stage cancelled)
19/10/12 10:30:31 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41, amp-jenkins-worker-03.amp, executor driver): TaskKilled (Stage cancelled)
19/10/12 10:30:31 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39, amp-jenkins-worker-03.amp, executor driver): TaskKilled (Stage cancelled)

Testing pip installation with python 3.5
Using /tmp/tmp.6dJoIguyJJ for virtualenv
Fetching package metadata ...........
Solving package specifications: .

Package plan for installation in environment /tmp/tmp.6dJoIguyJJ/3.5:

The following NEW packages will be INSTALLED:

    _libgcc_mutex:   0.1-main               
    blas:            1.0-mkl                
    ca-certificates: 2019.8.28-0            
    certifi:         2018.8.24-py35_1       
    intel-openmp:    2019.4-243             
    libedit:         3.1.20181209-hc058e9b_0
    libffi:          3.2.1-hd88cf55_4       
    libgcc-ng:       9.1.0-hdf63c60_0       
    libgfortran-ng:  7.3.0-hdf63c60_0       
    libstdcxx-ng:    9.1.0-hdf63c60_0       
    mkl:             2018.0.3-1             
    mkl_fft:         1.0.6-py35h7dd41cf_0   
    mkl_random:      1.0.1-py35h4414c95_1   
    ncurses:         6.1-he6710b0_1         
    numpy:           1.15.2-py35h1d66e8a_0  
    numpy-base:      1.15.2-py35h81de0dd_0  
    openssl:         1.0.2t-h7b6447c_1      
    pandas:          0.23.4-py35h04863e7_0  
    pip:             10.0.1-py35_0          
    python:          3.5.6-hc3d631a_0       
    python-dateutil: 2.7.3-py35_0           
    pytz:            2019.3-py_0            
    readline:        7.0-h7b6447c_5         
    setuptools:      40.2.0-py35_0          
    six:             1.11.0-py35_1          
    sqlite:          3.30.0-h7b6447c_0      
    tk:              8.6.8-hbc83047_0       
    wheel:           0.31.1-py35_0          
    xz:              5.2.4-h14c3975_4       
    zlib:            1.2.11-h7b6447c_3      

#
# To activate this environment, use:
# > source activate /tmp/tmp.6dJoIguyJJ/3.5
#
# To deactivate an active environment, use:
# > source deactivate
#

Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing requirements to pyspark.egg-info/requires.txt
writing pyspark.egg-info/PKG-INFO
writing top-level names to pyspark.egg-info/top_level.txt
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
Could not import pypandoc - required to package PySpark
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.0.0.dev0
creating pyspark-3.0.0.dev0/deps
creating pyspark-3.0.0.dev0/deps/bin
creating pyspark-3.0.0.dev0/deps/data
creating pyspark-3.0.0.dev0/deps/data/graphx
creating pyspark-3.0.0.dev0/deps/data/mllib
creating pyspark-3.0.0.dev0/deps/data/mllib/als
creating pyspark-3.0.0.dev0/deps/data/mllib/images
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.0.0.dev0/deps/data/streaming
creating pyspark-3.0.0.dev0/deps/examples
creating pyspark-3.0.0.dev0/deps/examples/ml
creating pyspark-3.0.0.dev0/deps/examples/mllib
creating pyspark-3.0.0.dev0/deps/examples/sql
creating pyspark-3.0.0.dev0/deps/examples/sql/streaming
creating pyspark-3.0.0.dev0/deps/examples/streaming
creating pyspark-3.0.0.dev0/deps/jars
creating pyspark-3.0.0.dev0/deps/licenses
creating pyspark-3.0.0.dev0/deps/sbin
creating pyspark-3.0.0.dev0/lib
creating pyspark-3.0.0.dev0/pyspark
creating pyspark-3.0.0.dev0/pyspark.egg-info
creating pyspark-3.0.0.dev0/pyspark/ml
creating pyspark-3.0.0.dev0/pyspark/ml/linalg
creating pyspark-3.0.0.dev0/pyspark/ml/param
creating pyspark-3.0.0.dev0/pyspark/mllib
creating pyspark-3.0.0.dev0/pyspark/mllib/linalg
creating pyspark-3.0.0.dev0/pyspark/mllib/stat
creating pyspark-3.0.0.dev0/pyspark/python
creating pyspark-3.0.0.dev0/pyspark/python/pyspark
creating pyspark-3.0.0.dev0/pyspark/sql
creating pyspark-3.0.0.dev0/pyspark/streaming
copying files to pyspark-3.0.0.dev0...
copying MANIFEST.in -> pyspark-3.0.0.dev0
copying README.md -> pyspark-3.0.0.dev0
copying setup.cfg -> pyspark-3.0.0.dev0
copying setup.py -> pyspark-3.0.0.dev0
copying deps/bin/beeline -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.0.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/robust_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/jars/JLargeArrays-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/JTransforms-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/algebra_2.12-2.0.0-M2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-format-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/audience-annotations-0.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/azure-storage-2.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze_2.12-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cats-kernel_2.12-2.0.0-M4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-text-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-aws-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-azure-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-api-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/istack-commons-runtime-3.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.9.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.annotation-api-1.3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.inject-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.ws.rs-api-2.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jakarta.xml.bind-api-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/janino-3.0.15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javassist-3.22.0-CR2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.el-3.0.1-b11.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-runtime-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-client-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-common-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-hk2-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-server-2.29.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-sslengine-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/lz4-java-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.42.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okapi-shade-0.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.6-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.6-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/py4j-0.10.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pyrolite-4.30.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-collection-compat_2.12-2.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.25.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT-tests.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-cypher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph-api_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-platform_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-util_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.17.0-M1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spotbugs-annotations-3.1.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/validation-api-2.0.1.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.14.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.14.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.3-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying lib/py4j-0.10.8.1-src.zip -> pyspark-3.0.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.0.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resourceinformation.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tree.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/cogroup.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.0.0.dev0/pyspark/streaming
Writing pyspark-3.0.0.dev0/setup.cfg
Creating tar archive
removing 'pyspark-3.0.0.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/NewSparkPullRequestBuilder/python
Collecting py4j==0.10.8.1 (from pyspark==3.0.0.dev0)
  Downloading https://files.pythonhosted.org/packages/04/de/2d314a921ef4c20b283e1de94e0780273678caac901564df06b948e4ba9b/py4j-0.10.8.1-py2.py3-none-any.whl (196kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py develop for pyspark
Successfully installed py4j-0.10.8.1 pyspark
You are using pip version 10.0.1, however version 19.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
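The pip warning above is a simple version comparison. A minimal standard-library sketch of that check (the `latest` tuple is hardcoded for illustration; the real pip client fetches it from PyPI):

```python
# Hypothetical re-creation of pip's "newer version available" check,
# using only the standard library. Assumes pip is installed in the
# current environment.
from importlib.metadata import version

# Compare only the (major, minor) components, as the warning text does.
installed = tuple(int(part) for part in version("pip").split(".")[:2])
latest = (19, 2)  # version named in the warning; hardcoded for illustration
needs_upgrade = installed < latest
```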
Run basic sanity check on pip installed version with spark-submit
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:122)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:980)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:989)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
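The `FileNotFoundException` above is benign here: log4j's `FileAppender` cannot create `target/unit-tests.log` because the `target/` directory does not exist in the pip-install sandbox. A hypothetical Python reproduction of the same root cause (opening a file whose parent directory is missing) and the usual fix (create the directory first):

```python
import pathlib
import tempfile

# Work in a throwaway directory so the sketch is self-contained.
base = pathlib.Path(tempfile.mkdtemp())
log_path = base / "target" / "unit-tests.log"

got_error = False
try:
    # Fails the same way log4j's FileAppender.setFile does: the
    # parent directory "target/" has not been created yet.
    open(log_path, "w")
except FileNotFoundError:
    got_error = True

# Creating the parent directory first lets the write succeed.
log_path.parent.mkdir(parents=True, exist_ok=True)
log_path.write_text("ok\n")
```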
Successfully ran pip sanity check
Run basic sanity check with import-based test
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:122)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:980)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:989)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
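The hint above means records below the configured level are suppressed until the level is lowered. A stdlib analogue of that behaviour (this uses Python's `logging` module, not Spark itself; `sc.setLogLevel("INFO")` is the PySpark equivalent):

```python
import logging

# With the level at WARNING (Spark's default here), INFO records
# are suppressed.
log = logging.getLogger("spark-sketch")
log.setLevel(logging.WARNING)
info_enabled_before = log.isEnabledFor(logging.INFO)

# Lowering the level re-enables them, analogous to
# sc.setLogLevel("INFO") in a PySpark shell.
log.setLevel(logging.INFO)
info_enabled_after = log.isEnabledFor(logging.INFO)
```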

[Stage 0:>                                                        (0 + 10) / 10]
                                                                                
Successfully ran pip sanity check
Run the tests for context.py
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:122)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:980)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:989)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:122)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:980)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:989)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]
                                                                                

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
                                                                                
Cleaning up temporary directory - /tmp/tmp.6dJoIguyJJ

========================================================================
Running SparkR tests
========================================================================
During startup - Warning message:
In .First() :
  Support for R prior to version 3.4 is deprecated since Spark 3.0.0
Loading required package: methods

Attaching package: 'SparkR'

The following objects are masked from 'package:testthat':

    describe, not

The following objects are masked from 'package:stats':

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

    as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
    sample, subset, summary, transform, union

Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
basic tests for CRAN: .............

DONE ===========================================================================
binary functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
...........
functions on binary files: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
....
broadcast variables: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
..
functions in client.R: .....
test functions in sparkR.R: ..........................................
include R packages: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder

JVM API: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
..
MLlib classification algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
......................................................................
MLlib clustering algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
......................................................................
MLlib frequent pattern mining: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
......
MLlib recommendation algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
........
MLlib regression algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
................................................................................................................................
MLlib statistics algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
........
MLlib tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
..............................................................................................
parallelize() and collect(): Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
.............................
basic RDD functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
............................................................................................................................................................................................................................................................................................................................................................................................................................................
SerDe functionality: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
.......................................
partitionBy, groupByKey, reduceByKey etc.: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
....................
functions in sparkR.R: ....
SparkSQL Arrow optimization: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
SSSSSSSSSS
test show SparkDataFrame when eager execution is enabled.: ......
SparkSQL functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
Structured Streaming: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
..........................................
tests RDD function take(): Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
................
the textFile() function: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
.............
functions in utils.R: Spark package found in SPARK_HOME: /home/jenkins/workspace/NewSparkPullRequestBuilder
.............................................
Windows-specific tests: S

Skipped ------------------------------------------------------------------------
1. createDataFrame/collect Arrow optimization (@test_sparkSQL_arrow.R#25) - arrow not installed

2. createDataFrame/collect Arrow optimization - many partitions (partition order test) (@test_sparkSQL_arrow.R#48) - arrow not installed

3. createDataFrame/collect Arrow optimization - type specification (@test_sparkSQL_arrow.R#64) - arrow not installed

4. dapply() Arrow optimization (@test_sparkSQL_arrow.R#94) - arrow not installed

5. dapply() Arrow optimization - type specification (@test_sparkSQL_arrow.R#134) - arrow not installed

6. dapply() Arrow optimization - type specification (date and timestamp) (@test_sparkSQL_arrow.R#169) - arrow not installed

7. gapply() Arrow optimization (@test_sparkSQL_arrow.R#188) - arrow not installed

8. gapply() Arrow optimization - type specification (@test_sparkSQL_arrow.R#237) - arrow not installed

9. gapply() Arrow optimization - type specification (date and timestamp) (@test_sparkSQL_arrow.R#276) - arrow not installed

10. Arrow optimization - unsupported types (@test_sparkSQL_arrow.R#297) - arrow not installed

11. sparkJars tag in SparkContext (@test_Windows.R#22) - This test is only for Windows, skipped

DONE ===========================================================================
Using R_SCRIPT_PATH = /usr/bin
++++ dirname /home/jenkins/workspace/NewSparkPullRequestBuilder/R/install-dev.sh
+++ cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/R
++ LIB_DIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib
++ mkdir -p /home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib
++ pushd /home/jenkins/workspace/NewSparkPullRequestBuilder/R
++ . /home/jenkins/workspace/NewSparkPullRequestBuilder/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ . /home/jenkins/workspace/NewSparkPullRequestBuilder/R/create-rd.sh
+++ set -o pipefail
+++ set -e
+++++ dirname /home/jenkins/workspace/NewSparkPullRequestBuilder/R/create-rd.sh
++++ cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R
++++ pwd
+++ FWDIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/R
+++ pushd /home/jenkins/workspace/NewSparkPullRequestBuilder/R
+++ . /home/jenkins/workspace/NewSparkPullRequestBuilder/R/find-r.sh
++++ '[' -z /usr/bin ']'
+++ /usr/bin/Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
Updating SparkR documentation
Loading SparkR
Loading required package: methods
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘alias’ from package ‘stats’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘mean’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘unique’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘head’ from package ‘utils’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
++ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib /home/jenkins/workspace/NewSparkPullRequestBuilder/R/pkg/
* installing *source* package ‘SparkR’ ...
** R
** inst
** preparing package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘alias’ from package ‘stats’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘mean’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘unique’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘head’ from package ‘utils’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
  converting help for package ‘SparkR’
    finding HTML links ... done
    AFTSurvivalRegressionModel-class        html  
    ALSModel-class                          html  
    BisectingKMeansModel-class              html  
    DecisionTreeClassificationModel-class   html  
    DecisionTreeRegressionModel-class       html  
    FPGrowthModel-class                     html  
    GBTClassificationModel-class            html  
    GBTRegressionModel-class                html  
    GaussianMixtureModel-class              html  
    GeneralizedLinearRegressionModel-class
                                            html  
    GroupedData                             html  
    IsotonicRegressionModel-class           html  
    KMeansModel-class                       html  
    KSTest-class                            html  
    LDAModel-class                          html  
    LinearSVCModel-class                    html  
    LogisticRegressionModel-class           html  
    MultilayerPerceptronClassificationModel-class
                                            html  
    NaiveBayesModel-class                   html  
    PowerIterationClustering-class          html  
    PrefixSpan-class                        html  
    RandomForestClassificationModel-class   html  
    RandomForestRegressionModel-class       html  
    SparkDataFrame                          html  
    StreamingQuery                          html  
    WindowSpec                              html  
    alias                                   html  
    approxQuantile                          html  
    arrange                                 html  
    as.data.frame                           html  
    attach                                  html  
    avg                                     html  
    awaitTermination                        html  
    between                                 html  
    broadcast                               html  
    cache                                   html  
    cacheTable                              html  
    cancelJobGroup                          html  
    cast                                    html  
    checkpoint                              html  
    clearCache                              html  
    clearJobGroup                           html  
    coalesce                                html  
    collect                                 html  
    coltypes                                html  
    column                                  html  
    column_aggregate_functions              html  
    column_collection_functions             html  
    column_datetime_diff_functions          html  
    column_datetime_functions               html  
    column_math_functions                   html  
    column_misc_functions                   html  
    column_nonaggregate_functions           html  
    column_string_functions                 html  
    column_window_functions                 html  
    columnfunctions                         html  
    columns                                 html  
    corr                                    html  
    count                                   html  
    cov                                     html  
    createDataFrame                         html  
    createOrReplaceTempView                 html  
    createTable                             html  
    crossJoin                               html  
    crosstab                                html  
    cube                                    html  
    currentDatabase                         html  
    dapply                                  html  
    dapplyCollect                           html  
    describe                                html  
    dim                                     html  
    distinct                                html  
    drop                                    html  
    dropDuplicates                          html  
    dropTempView                            html  
    dtypes                                  html  
    endsWith                                html  
    eq_null_safe                            html  
    except                                  html  
    exceptAll                               html  
    explain                                 html  
    filter                                  html  
    first                                   html  
    fitted                                  html  
    freqItems                               html  
    gapply                                  html  
    gapplyCollect                           html  
    getLocalProperty                        html  
    getNumPartitions                        html  
    glm                                     html  
    groupBy                                 html  
    hashCode                                html  
    head                                    html  
    hint                                    html  
    histogram                               html  
    insertInto                              html  
    install.spark                           html  
    intersect                               html  
    intersectAll                            html  
    isActive                                html  
    isLocal                                 html  
    isStreaming                             html  
    join                                    html  
    last                                    html  
    lastProgress                            html  
    limit                                   html  
    listColumns                             html  
    listDatabases                           html  
    listFunctions                           html  
    listTables                              html  
    localCheckpoint                         html  
    match                                   html  
    merge                                   html  
    mutate                                  html  
    nafunctions                             html  
    ncol                                    html  
    not                                     html  
    nrow                                    html  
    orderBy                                 html  
    otherwise                               html  
    over                                    html  
    partitionBy                             html  
    persist                                 html  
    pivot                                   html  
    predict                                 html  
    print.jobj                              html  
    print.structField                       html  
    print.structType                        html  
    printSchema                             html  
    queryName                               html  
    randomSplit                             html  
    rangeBetween                            html  
    rbind                                   html  
    read.df                                 html  
    read.jdbc                               html  
    read.json                               html  
    read.ml                                 html  
    read.orc                                html  
    read.parquet                            html  
    read.stream                             html  
    read.text                               html  
    recoverPartitions                       html  
    refreshByPath                           html  
    refreshTable                            html  
    rename                                  html  
    repartition                             html  
    repartitionByRange                      html  
    rollup                                  html  
    rowsBetween                             html  
    sample                                  html  
    sampleBy                                html  
    saveAsTable                             html  
    schema                                  html  
    select                                  html  
    selectExpr                              html  
    setCheckpointDir                        html  
    setCurrentDatabase                      html  
    setJobDescription                       html  
    setJobGroup                             html  
    setLocalProperty                        html  
    setLogLevel                             html  
    show                                    html  
    showDF                                  html  
    spark.addFile                           html  
    spark.als                               html  
    spark.bisectingKmeans                   html  
    spark.decisionTree                      html  
    spark.fpGrowth                          html  
    spark.gaussianMixture                   html  
    spark.gbt                               html  
    spark.getSparkFiles                     html  
    spark.getSparkFilesRootDirectory        html  
    spark.glm                               html  
    spark.isoreg                            html  
    spark.kmeans                            html  
    spark.kstest                            html  
    spark.lapply                            html  
    spark.lda                               html  
    spark.logit                             html  
    spark.mlp                               html  
    spark.naiveBayes                        html  
    spark.powerIterationClustering          html  
    spark.prefixSpan                        html  
    spark.randomForest                      html  
    spark.survreg                           html  
    spark.svmLinear                         html  
    sparkR.callJMethod                      html  
    sparkR.callJStatic                      html  
    sparkR.conf                             html  
    sparkR.newJObject                       html  
    sparkR.session                          html  
    sparkR.session.stop                     html  
    sparkR.uiWebUrl                         html  
    sparkR.version                          html  
    sql                                     html  
    startsWith                              html  
    status                                  html  
    stopQuery                               html  
    storageLevel                            html  
    str                                     html  
    structField                             html  
    structType                              html  
    subset                                  html  
    substr                                  html  
    summarize                               html  
    summary                                 html  
    tableNames                              html  
    tableToDF                               html  
    tables                                  html  
    take                                    html  
    toJSON                                  html  
    uncacheTable                            html  
    union                                   html  
    unionAll                                html  
    unionByName                             html  
    unpersist                               html  
    windowOrderBy                           html  
    windowPartitionBy                       html  
    with                                    html  
    withColumn                              html  
    withWatermark                           html  
    write.df                                html  
    write.jdbc                              html  
    write.json                              html  
    write.ml                                html  
    write.orc                               html  
    write.parquet                           html  
    write.stream                            html  
    write.text                              html  
** building package indices
** installing vignettes
** testing if installed package can be loaded
* DONE (SparkR)
++ cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib
++ jar cfM /home/jenkins/workspace/NewSparkPullRequestBuilder/R/lib/sparkr.zip SparkR
++ popd
++ cd /home/jenkins/workspace/NewSparkPullRequestBuilder/R/..
++ pwd
+ SPARK_HOME=/home/jenkins/workspace/NewSparkPullRequestBuilder
+ . /home/jenkins/workspace/NewSparkPullRequestBuilder/bin/load-spark-env.sh
++ '[' -z /home/jenkins/workspace/NewSparkPullRequestBuilder ']'
++ SPARK_ENV_SH=spark-env.sh
++ '[' -z '' ']'
++ export SPARK_ENV_LOADED=1
++ SPARK_ENV_LOADED=1
++ export SPARK_CONF_DIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/conf
++ SPARK_CONF_DIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/conf
++ SPARK_ENV_SH=/home/jenkins/workspace/NewSparkPullRequestBuilder/conf/spark-env.sh
++ [[ -f /home/jenkins/workspace/NewSparkPullRequestBuilder/conf/spark-env.sh ]]
++ export SPARK_SCALA_VERSION=2.12
++ SPARK_SCALA_VERSION=2.12
+ '[' -f /home/jenkins/workspace/NewSparkPullRequestBuilder/RELEASE ']'
+ SPARK_JARS_DIR=/home/jenkins/workspace/NewSparkPullRequestBuilder/assembly/target/scala-2.12/jars
+ '[' -d /home/jenkins/workspace/NewSparkPullRequestBuilder/assembly/target/scala-2.12/jars ']'
+ SPARK_HOME=/home/jenkins/workspace/NewSparkPullRequestBuilder
+ /usr/bin/R CMD build /home/jenkins/workspace/NewSparkPullRequestBuilder/R/pkg
* checking for file ‘/home/jenkins/workspace/NewSparkPullRequestBuilder/R/pkg/DESCRIPTION’ ... OK
* preparing ‘SparkR’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... OK
* checking for LF line-endings in source and make files
* checking for empty or unneeded directories
* building ‘SparkR_3.0.0.tar.gz’

+ find pkg/vignettes/. -not -name . -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
++ grep Version /home/jenkins/workspace/NewSparkPullRequestBuilder/R/pkg/DESCRIPTION
++ awk '{print $NF}'
+ VERSION=3.0.0
+ CRAN_CHECK_OPTIONS=--as-cran
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests'
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests --no-manual --no-vignettes'
+ echo 'Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options'
Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options
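The trace above shows the check script building up `CRAN_CHECK_OPTIONS` flag by flag before invoking `R CMD check`. A minimal self-contained sketch of that assembly logic (variable names `NO_TESTS` and `NO_MANUAL_VIGNETTES` are assumptions standing in for whatever the script tests with `[ -n ... ]`):

```shell
#!/bin/sh
# Sketch of the CRAN_CHECK_OPTIONS assembly seen in the trace above.
# NO_TESTS / NO_MANUAL_VIGNETTES are hypothetical names for the flags
# the real script checks; both were set ("-n 1") in this build.
CRAN_CHECK_OPTIONS="--as-cran"
NO_TESTS=1
NO_MANUAL_VIGNETTES=1
if [ -n "$NO_TESTS" ]; then
  CRAN_CHECK_OPTIONS="$CRAN_CHECK_OPTIONS --no-tests"
fi
if [ -n "$NO_MANUAL_VIGNETTES" ]; then
  CRAN_CHECK_OPTIONS="$CRAN_CHECK_OPTIONS --no-manual --no-vignettes"
fi
echo "Running CRAN check with $CRAN_CHECK_OPTIONS options"
```

With both flags set, this echoes the same option string the log reports before the `R CMD check` invocation.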
+ export _R_CHECK_FORCE_SUGGESTS_=FALSE
+ _R_CHECK_FORCE_SUGGESTS_=FALSE
+ '[' -n 1 ']'
+ '[' -n 1 ']'
+ /usr/bin/R CMD check --as-cran --no-tests --no-manual --no-vignettes SparkR_3.0.0.tar.gz
* using log directory ‘/home/jenkins/workspace/NewSparkPullRequestBuilder/R/SparkR.Rcheck’
* using R version 3.1.1 (2014-07-10)
* using platform: x86_64-redhat-linux-gnu (64-bit)
* using session charset: UTF-8
* using options ‘--no-tests --no-vignettes’
* checking for file ‘SparkR/DESCRIPTION’ ... OK
* checking extension type ... Package
* this is package ‘SparkR’ version ‘3.0.0’
* checking CRAN incoming feasibility ... NOTE
Maintainer: ‘Shivaram Venkataraman <shivaram@cs.berkeley.edu>’
Unknown, possibly mis-spelled, fields in DESCRIPTION:
  ‘RoxygenNote’
CRAN repository db overrides:
  X-CRAN-History: Archived on 2017-10-22 for policy violation.
    Unarchived on 2018-03-03. Archived on 2018-05-01 as check problems
    were not corrected despite reminders. Unarchived on 2019-04-04.
* checking package namespace information ... OK
* checking package dependencies ... NOTE
  No repository set, so cyclic dependency check skipped

Package suggested but not available for checking: ‘arrow’
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘SparkR’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking installed files from ‘inst/doc’ ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
* checking for unstated dependencies in tests ... OK
* checking tests ... SKIPPED
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes in ‘inst/doc’ ... OK
* checking running R code from vignettes ... SKIPPED
* checking re-building of vignette outputs ... SKIPPED

NOTE: There were 2 notes.
See
  ‘/home/jenkins/workspace/NewSparkPullRequestBuilder/R/SparkR.Rcheck/00check.log’
for details.

+ popd
Tests passed.
Attempting to post to Github...
 > Post successful.
Archiving artifacts
Recording test results
Finished: SUCCESS