Success
Console Output

Skipping 15,182 KB..
md -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-2.3.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-2.3.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-2.3.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-2.3.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-2.3.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/kittens/not-image.txt -> pyspark-2.3.0.dev0/deps/data/mllib/images/kittens
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-2.3.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-2.3.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/sql/basic.py -> pyspark-2.3.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-2.3.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-2.3.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-2.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/direct_kafka_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/flume_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/kafka_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.5.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/aircompressor-0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arrow-format-0.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-1.7.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.7.7-tests.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.7.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.7.7-hadoop2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/base64-2.3.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/bcprov-jdk15on-1.52.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.11-0.13.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/breeze_2.11-0.13.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/calcite-avatica-1.2.0-incubating.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/calcite-core-1.2.0-incubating.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/calcite-linq4j-1.2.0-incubating.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/chill-java-0.8.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/chill_2.11-0.8.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-beanutils-core-1.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-codec-1.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-compress-1.4.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/curator-client-2.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/curator-framework-2.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/eigenbase-properties-1.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/flatbuffers-1.2.0-3f79e055.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/generex-1.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/htrace-core-3.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.6.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-core-2.7.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.6.7.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.6.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.7.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.7.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.11-2.6.7.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/janino-3.0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/java-xmlbuilder-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jets3t-0.9.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-client-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-http-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-io-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-security-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-server-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-util-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jline-2.12.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.11-3.2.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/json4s-core_2.11-3.2.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.11-3.2.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.16.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/kryo-shaded-3.0.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/kubernetes-client-3.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-2.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/libthrift-0.9.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.8.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/lz4-java-1.4.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/machinist_2.11-0.6.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/macro-compat_2.11-1.1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.17.Final.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/objenesis-2.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/okhttp-3.8.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/okio-1.13.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/orc-core-1.4.1-nohive.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.4.1-nohive.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-column-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-common-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-format-2.3.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/pmml-model-1.2.15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/pmml-schema-1.2.15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/py4j-0.10.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/pyrolite-4.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-library-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.11-1.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-xml_2.11-1.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scalap-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/shapeless_2.11-2.3.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.25.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.2.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.3.0-SNAPSHOT-tests.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-core_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-hive_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-repl_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-sql_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-tags_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spire-macros_2.11-0.13.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spire_2.11-0.13.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stream-2.7.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.5.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xbean-asm5-shaded-4.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xz-1.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.3.2-2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-DPark.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-Mockito.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-SnapTree.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-antlr.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-boto.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-f2j.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-javolution.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jbcrypt.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jline.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jpmml-model.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-junit-interface.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-kryo.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-minlog.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-netlib.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-paranamer.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-postgresql.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-protobuf.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-pyrolite.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-reflectasm.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-scala.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-scalacheck.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-scopt.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-slf4j.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-spire.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-xmlenc.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-zstd-jni.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-zstd.txt -> pyspark-2.3.0.dev0/deps/licenses
copying lib/py4j-0.10.6-src.zip -> pyspark-2.3.0.dev0/lib
copying lib/pyspark.zip -> pyspark-2.3.0.dev0/lib
copying pyspark/__init__.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/context.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/files.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/join.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/status.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/tests.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/util.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/version.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/tests.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-2.3.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-2.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-2.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-2.3.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tests.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-2.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-2.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-2.3.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/tests.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/flume.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/kafka.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/tests.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-2.3.0.dev0/pyspark/streaming
Writing pyspark-2.3.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-2.3.0.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-2.3.0.dev0.tar.gz
Collecting py4j==0.10.6 (from pyspark==2.3.0.dev0)
  Downloading py4j-0.10.6-py2.py3-none-any.whl (189kB)
Installing collected packages: py4j, pyspark
  Running setup.py install for pyspark: started
    Running setup.py install for pyspark: finished with status 'done'
Successfully installed py4j-0.10.6 pyspark-2.3.0.dev0
Run basic sanity check on pip installed version with spark-submit
2018-01-03 19:09:03 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-01-03 19:09:03 INFO  SparkContext:54 - Running Spark version 2.3.0-SNAPSHOT
2018-01-03 19:09:03 INFO  SparkContext:54 - Submitted application: PipSanityCheck
2018-01-03 19:09:03 INFO  SecurityManager:54 - Changing view acls to: jenkins
2018-01-03 19:09:03 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2018-01-03 19:09:03 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-01-03 19:09:03 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-01-03 19:09:03 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2018-01-03 19:09:04 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 43931.
2018-01-03 19:09:04 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-01-03 19:09:04 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-01-03 19:09:04 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-01-03 19:09:04 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-01-03 19:09:04 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-31e8e4c6-3712-4d2c-84ad-7f00a23e4e8d
2018-01-03 19:09:04 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-01-03 19:09:04 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-01-03 19:09:04 INFO  log:192 - Logging initialized @2082ms
2018-01-03 19:09:04 INFO  Server:346 - jetty-9.3.20.v20170531
2018-01-03 19:09:04 INFO  Server:414 - Started @2159ms
2018-01-03 19:09:04 INFO  AbstractConnector:278 - Started ServerConnector@7085ae15{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-01-03 19:09:04 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@21082f42{/jobs,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@5eeda9fb{/jobs/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@39fef5bd{/jobs/job,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@2e7bed98{/jobs/job/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@1dea3bdd{/stages,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@53f776c0{/stages/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@7aaf7b8b{/stages/stage,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@2f98875c{/stages/stage/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@2e6ac207{/stages/pool,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@6f917ba7{/stages/pool/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@25aa5f81{/storage,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@4e638e0f{/storage/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@3661173e{/storage/rdd,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@52c90e16{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@25680599{/environment,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@77a9f4cd{/environment/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@49cfc5e1{/executors,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@536cdb70{/executors/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@36cd4849{/executors/threadDump,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@efaddc1{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@39adf132{/static,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@703bf59f{/,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@6679a2d5{/api,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@701ebac2{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@53001f40{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-01-03 19:09:04 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://amp-jenkins-worker-08.amp:4040
2018-01-03 19:09:04 INFO  SparkContext:54 - Added file file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py at file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py with timestamp 1515035344643
2018-01-03 19:09:04 INFO  Utils:54 - Copying /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py to /tmp/spark-aa26b16d-5853-4a15-8cc0-c0bc3a6d99c8/userFiles-19577726-d936-42a6-9c59-e61c72c192ba/pip-sanity-check.py
2018-01-03 19:09:04 INFO  Executor:54 - Starting executor ID driver on host localhost
2018-01-03 19:09:04 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34055.
2018-01-03 19:09:04 INFO  NettyBlockTransferService:54 - Server created on amp-jenkins-worker-08.amp:34055
2018-01-03 19:09:04 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-01-03 19:09:04 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-08.amp, 34055, None)
2018-01-03 19:09:04 INFO  BlockManagerMasterEndpoint:54 - Registering block manager amp-jenkins-worker-08.amp:34055 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-08.amp, 34055, None)
2018-01-03 19:09:04 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-08.amp, 34055, None)
2018-01-03 19:09:04 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-08.amp, 34055, None)
2018-01-03 19:09:04 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@7b1b1ecc{/metrics/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:05 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
2018-01-03 19:09:05 INFO  SharedState:54 - Warehouse path is 'file:/spark-warehouse'.
2018-01-03 19:09:05 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@49e4b74c{/SQL,null,AVAILABLE,@Spark}
2018-01-03 19:09:05 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@40f57fb3{/SQL/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:05 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@5c5edf60{/SQL/execution,null,AVAILABLE,@Spark}
2018-01-03 19:09:05 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@6e20943e{/SQL/execution/json,null,AVAILABLE,@Spark}
2018-01-03 19:09:05 INFO  ContextHandler:781 - Started o.e.j.s.ServletContextHandler@5ce2a5bc{/static/sql,null,AVAILABLE,@Spark}
2018-01-03 19:09:05 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-01-03 19:09:05 INFO  SparkContext:54 - Starting job: reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32
2018-01-03 19:09:05 INFO  DAGScheduler:54 - Got job 0 (reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32) with 10 output partitions
2018-01-03 19:09:05 INFO  DAGScheduler:54 - Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32)
2018-01-03 19:09:05 INFO  DAGScheduler:54 - Parents of final stage: List()
2018-01-03 19:09:05 INFO  DAGScheduler:54 - Missing parents: List()
2018-01-03 19:09:05 INFO  DAGScheduler:54 - Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32), which has no missing parents
2018-01-03 19:09:05 INFO  MemoryStore:54 - Block broadcast_0 stored as values in memory (estimated size 4.9 KB, free 366.3 MB)
2018-01-03 19:09:05 INFO  MemoryStore:54 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 3.3 KB, free 366.3 MB)
2018-01-03 19:09:05 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on amp-jenkins-worker-08.amp:34055 (size: 3.3 KB, free: 366.3 MB)
2018-01-03 19:09:05 INFO  SparkContext:54 - Created broadcast 0 from broadcast at DAGScheduler.scala:1029
2018-01-03 19:09:05 INFO  DAGScheduler:54 - Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2018-01-03 19:09:05 INFO  TaskSchedulerImpl:54 - Adding task set 0.0 with 10 tasks
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7839 bytes)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 5.0 in stage 0.0 (TID 5)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 0.0 in stage 0.0 (TID 0)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 4.0 in stage 0.0 (TID 4)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 7.0 in stage 0.0 (TID 7)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 9.0 in stage 0.0 (TID 9)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 8.0 in stage 0.0 (TID 8)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 2.0 in stage 0.0 (TID 2)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 3.0 in stage 0.0 (TID 3)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 1.0 in stage 0.0 (TID 1)
2018-01-03 19:09:06 INFO  Executor:54 - Running task 6.0 in stage 0.0 (TID 6)
2018-01-03 19:09:06 INFO  Executor:54 - Fetching file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py with timestamp 1515035344643
2018-01-03 19:09:06 INFO  Utils:54 - /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py has been previously copied to /tmp/spark-aa26b16d-5853-4a15-8cc0-c0bc3a6d99c8/userFiles-19577726-d936-42a6-9c59-e61c72c192ba/pip-sanity-check.py
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 372, boot = 360, init = 11, finish = 1
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 373, boot = 370, init = 3, finish = 0
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 371, boot = 364, init = 7, finish = 0
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 372, boot = 366, init = 5, finish = 1
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 377, boot = 373, init = 3, finish = 1
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 380, boot = 376, init = 4, finish = 0
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 383, boot = 379, init = 3, finish = 1
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 385, boot = 382, init = 3, finish = 0
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 387, boot = 385, init = 2, finish = 0
2018-01-03 19:09:06 INFO  PythonRunner:54 - Times: total = 390, boot = 387, init = 3, finish = 0
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 7.0 in stage 0.0 (TID 7). 1311 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 1.0 in stage 0.0 (TID 1). 1310 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 0.0 in stage 0.0 (TID 0). 1310 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 5.0 in stage 0.0 (TID 5). 1311 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 4.0 in stage 0.0 (TID 4). 1311 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 2.0 in stage 0.0 (TID 2). 1310 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 8.0 in stage 0.0 (TID 8). 1311 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 3.0 in stage 0.0 (TID 3). 1268 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 6.0 in stage 0.0 (TID 6). 1268 bytes result sent to driver
2018-01-03 19:09:06 INFO  Executor:54 - Finished task 9.0 in stage 0.0 (TID 9). 1311 bytes result sent to driver
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 542 ms on localhost (executor driver) (1/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 5.0 in stage 0.0 (TID 5) in 522 ms on localhost (executor driver) (2/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 1.0 in stage 0.0 (TID 1) in 524 ms on localhost (executor driver) (3/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 7.0 in stage 0.0 (TID 7) in 521 ms on localhost (executor driver) (4/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 4.0 in stage 0.0 (TID 4) in 523 ms on localhost (executor driver) (5/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 3.0 in stage 0.0 (TID 3) in 525 ms on localhost (executor driver) (6/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 8.0 in stage 0.0 (TID 8) in 522 ms on localhost (executor driver) (7/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 2.0 in stage 0.0 (TID 2) in 527 ms on localhost (executor driver) (8/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 6.0 in stage 0.0 (TID 6) in 525 ms on localhost (executor driver) (9/10)
2018-01-03 19:09:06 INFO  TaskSetManager:54 - Finished task 9.0 in stage 0.0 (TID 9) in 523 ms on localhost (executor driver) (10/10)
2018-01-03 19:09:06 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool 
2018-01-03 19:09:06 INFO  DAGScheduler:54 - ResultStage 0 (reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32) finished in 0.721 s
2018-01-03 19:09:06 INFO  DAGScheduler:54 - Job 0 finished: reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/dev/pip-sanity-check.py:32, took 0.765033 s
Successfully ran pip sanity check
2018-01-03 19:09:06 INFO  AbstractConnector:318 - Stopped Spark@7085ae15{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-01-03 19:09:06 INFO  SparkUI:54 - Stopped Spark web UI at http://amp-jenkins-worker-08.amp:4040
2018-01-03 19:09:06 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-01-03 19:09:06 INFO  MemoryStore:54 - MemoryStore cleared
2018-01-03 19:09:06 INFO  BlockManager:54 - BlockManager stopped
2018-01-03 19:09:06 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-01-03 19:09:06 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-01-03 19:09:06 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-01-03 19:09:07 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-01-03 19:09:07 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-aa26b16d-5853-4a15-8cc0-c0bc3a6d99c8
2018-01-03 19:09:07 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-e482c525-26e3-4a18-9ca4-3c0bf90c39fa
2018-01-03 19:09:07 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-aa26b16d-5853-4a15-8cc0-c0bc3a6d99c8/pyspark-48122c8d-b813-4747-a873-803e767c86ad
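The job above is the spark-submit run of dev/pip-sanity-check.py (its path appears in the "reduce at" call site and in the "Fetching file:" line): a local SparkSession performing a 10-partition reduce, which is the ResultStage 0 shown in the scheduler log. A minimal sketch of that kind of check, not the exact contents of the script, looks like this:

    import sys
    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        # Local SparkSession resolved from the pip-installed pyspark package.
        spark = SparkSession.builder.appName("PipSanityCheck").getOrCreate()
        sc = spark.sparkContext

        # 10 partitions matches the "Submitting 10 missing tasks" line above;
        # sum(range(100)) == 4950 is the expected reduce result.
        value = sc.parallelize(range(100), 10).reduce(lambda x, y: x + y)
        if value != 4950:
            print("Sanity check failed, got %d" % value, file=sys.stderr)
            sys.exit(1)

        print("Successfully ran pip sanity check")
        spark.stop()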
Run basic sanity check with import-based test
2018-01-03 19:09:09 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Successfully ran pip sanity check
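The import-based variant repeats the same check from a plain python interpreter instead of spark-submit, so it also exercises pyspark/find_spark_home.py locating the bundled Spark distribution inside the installed package. Roughly:

    # Run under a bare "python", not spark-submit; the pip-installed package
    # (via its find_spark_home.py) must locate the bundled jars on its own.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("PipImportCheck").getOrCreate()
    assert spark.sparkContext.parallelize(range(100), 10).sum() == 4950
    print("Successfully ran pip sanity check")
    spark.stop()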
Run the tests for context.py
2018-01-03 19:09:14 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2018-01-03 19:09:17 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]
[Stage 10:>                                                         (0 + 4) / 4]
[Stage 11:>                                                         (0 + 0) / 2]
2018-01-03 19:09:28 WARN  PythonRunner:66 - Incomplete task 0.39 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
2018-01-03 19:09:28 WARN  PythonRunner:66 - Incomplete task 2.41 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
2018-01-03 19:09:28 WARN  PythonRunner:66 - Incomplete task 3.42 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
2018-01-03 19:09:28 WARN  PythonRunner:66 - Incomplete task 1.40 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
2018-01-03 19:09:28 WARN  TaskSetManager:66 - Lost task 2.0 in stage 10.0 (TID 41, localhost, executor driver): TaskKilled (Stage cancelled)
2018-01-03 19:09:28 WARN  TaskSetManager:66 - Lost task 3.0 in stage 10.0 (TID 42, localhost, executor driver): TaskKilled (Stage cancelled)
2018-01-03 19:09:28 WARN  TaskSetManager:66 - Lost task 0.0 in stage 10.0 (TID 39, localhost, executor driver): TaskKilled (Stage cancelled)
2018-01-03 19:09:28 WARN  TaskSetManager:66 - Lost task 1.0 in stage 10.0 (TID 40, localhost, executor driver): TaskKilled (Stage cancelled)

[Stage 11:>                                                         (0 + 2) / 2]
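The "tests for context.py" step runs the doctests embedded in pyspark/context.py; the interrupted stage-10 tasks above are expected there, since those doctests exercise job cancellation (setJobGroup/cancelJobGroup). PySpark modules expose their doctests through a small _test() entry point along these lines (a sketch of the pattern, not the literal Spark source):

    import doctest
    import sys
    from pyspark import SparkContext

    def _test():
        # Build a small local context, run the module's doctests against it,
        # and fail the build on any doctest failure.
        globs = {"sc": SparkContext("local[4]", "context-doctests")}
        failed, _ = doctest.testmod(globs=globs, optionflags=doctest.ELLIPSIS)
        globs["sc"].stop()
        if failed:
            sys.exit(-1)

    if __name__ == "__main__":
        _test()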
                                                                                
Testing pip installation with python 3.5
Using /tmp/tmp.9a42384Fo4 for virtualenv
Fetching package metadata ...........
Solving package specifications: .

Package plan for installation in environment /tmp/tmp.9a42384Fo4/3.5:

The following NEW packages will be INSTALLED:

    ca-certificates: 2017.08.26-h1d4fec5_0   
    certifi:         2017.11.5-py35h9749603_0
    intel-openmp:    2018.0.0-hc7b2577_8     
    libedit:         3.1-heed3624_0          
    libffi:          3.2.1-hd88cf55_4        
    libgcc-ng:       7.2.0-h7cc24e2_2        
    libstdcxx-ng:    7.2.0-h7a57d05_2        
    mkl:             2018.0.1-h19d6760_4     
    ncurses:         6.0-h9df7e31_2          
    numpy:           1.13.3-py35hd829ed6_0   
    openssl:         1.0.2n-hb7f436b_0       
    pandas:          0.22.0-py35hf484d3e_0   
    pip:             9.0.1-py35h7e7da9d_4    
    python:          3.5.4-h417fded_24       
    python-dateutil: 2.6.1-py35h90d5b31_1    
    pytz:            2017.3-py35hb13c558_0   
    readline:        7.0-ha6073c6_4          
    setuptools:      36.5.0-py35ha8c1747_0   
    six:             1.11.0-py35h423b573_1   
    sqlite:          3.20.1-hb898158_2       
    tk:              8.6.7-hc745277_3        
    wheel:           0.30.0-py35hd3883cf_1   
    xz:              5.2.3-h55aa19d_2        
    zlib:            1.2.11-ha838bed_2       

#
# To activate this environment, use:
# > source activate /tmp/tmp.9a42384Fo4/3.5
#
# To deactivate an active environment, use:
# > source deactivate
#
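The package plan above comes from the harness creating a throwaway conda environment at /tmp/tmp.9a42384Fo4/3.5 with Python 3.5 plus numpy and pandas available to the packaging tests. Expressed in Python for consistency with the other sketches here (the test script issues the equivalent conda command from shell, and its exact flags may differ), the step is roughly:

    import subprocess

    # Hypothetical re-creation of the environment shown above.
    subprocess.check_call([
        "conda", "create", "-y", "-p", "/tmp/tmp.9a42384Fo4/3.5",
        "python=3.5", "numpy", "pandas", "pip", "setuptools",
    ])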

Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing pyspark.egg-info/PKG-INFO
writing requirements to pyspark.egg-info/requires.txt
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing top-level names to pyspark.egg-info/top_level.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
Could not import pypandoc - required to package PySpark
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-2.3.0.dev0
creating pyspark-2.3.0.dev0/deps
creating pyspark-2.3.0.dev0/deps/bin
creating pyspark-2.3.0.dev0/deps/data
creating pyspark-2.3.0.dev0/deps/data/graphx
creating pyspark-2.3.0.dev0/deps/data/mllib
creating pyspark-2.3.0.dev0/deps/data/mllib/als
creating pyspark-2.3.0.dev0/deps/data/mllib/images
creating pyspark-2.3.0.dev0/deps/data/mllib/images/kittens
creating pyspark-2.3.0.dev0/deps/data/mllib/ridge-data
creating pyspark-2.3.0.dev0/deps/data/streaming
creating pyspark-2.3.0.dev0/deps/examples
creating pyspark-2.3.0.dev0/deps/examples/ml
creating pyspark-2.3.0.dev0/deps/examples/mllib
creating pyspark-2.3.0.dev0/deps/examples/sql
creating pyspark-2.3.0.dev0/deps/examples/sql/streaming
creating pyspark-2.3.0.dev0/deps/examples/streaming
creating pyspark-2.3.0.dev0/deps/jars
creating pyspark-2.3.0.dev0/deps/licenses
creating pyspark-2.3.0.dev0/lib
creating pyspark-2.3.0.dev0/pyspark
creating pyspark-2.3.0.dev0/pyspark.egg-info
creating pyspark-2.3.0.dev0/pyspark/ml
creating pyspark-2.3.0.dev0/pyspark/ml/linalg
creating pyspark-2.3.0.dev0/pyspark/ml/param
creating pyspark-2.3.0.dev0/pyspark/mllib
creating pyspark-2.3.0.dev0/pyspark/mllib/linalg
creating pyspark-2.3.0.dev0/pyspark/mllib/stat
creating pyspark-2.3.0.dev0/pyspark/python
creating pyspark-2.3.0.dev0/pyspark/python/pyspark
creating pyspark-2.3.0.dev0/pyspark/sql
creating pyspark-2.3.0.dev0/pyspark/streaming
copying files to pyspark-2.3.0.dev0...
copying MANIFEST.in -> pyspark-2.3.0.dev0
copying README.md -> pyspark-2.3.0.dev0
copying setup.cfg -> pyspark-2.3.0.dev0
copying setup.py -> pyspark-2.3.0.dev0
copying deps/bin/beeline -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-2.3.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-2.3.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-2.3.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-2.3.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-2.3.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-2.3.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-2.3.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/kittens/not-image.txt -> pyspark-2.3.0.dev0/deps/data/mllib/images/kittens
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-2.3.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-2.3.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-2.3.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-2.3.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-2.3.0.dev0/deps/examples/mllib
copying deps/examples/sql/basic.py -> pyspark-2.3.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-2.3.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-2.3.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-2.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/direct_kafka_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/flume_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/kafka_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-2.3.0.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.5.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/aircompressor-0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arrow-format-0.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-1.7.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.7.7-tests.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.7.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.7.7-hadoop2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/base64-2.3.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/bcprov-jdk15on-1.52.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.11-0.13.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/breeze_2.11-0.13.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/calcite-avatica-1.2.0-incubating.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/calcite-core-1.2.0-incubating.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/calcite-linq4j-1.2.0-incubating.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/chill-java-0.8.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/chill_2.11-0.8.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-beanutils-core-1.8.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-codec-1.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-compress-1.4.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/curator-client-2.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/curator-framework-2.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/eigenbase-properties-1.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/flatbuffers-1.2.0-3f79e055.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/generex-1.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.6.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/htrace-core-3.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.6.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-core-2.7.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.6.7.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.6.7.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.7.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.7.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.11-2.6.7.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/janino-3.0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/java-xmlbuilder-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jets3t-0.9.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-client-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-http-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-io-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-security-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-server-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-util-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.3.20.v20170531.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jline-2.12.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.11-3.2.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/json4s-core_2.11-3.2.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.11-3.2.11.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.16.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/kryo-shaded-3.0.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/kubernetes-client-3.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-2.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/libthrift-0.9.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.8.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/lz4-java-1.4.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/machinist_2.11-0.6.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/macro-compat_2.11-1.1.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.17.Final.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/objenesis-2.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/okhttp-3.8.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/okio-1.13.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/orc-core-1.4.1-nohive.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.4.1-nohive.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-column-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-common-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-format-2.3.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.8.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/pmml-model-1.2.15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/pmml-schema-1.2.15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/py4j-0.10.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/pyrolite-4.13.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-library-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.11-1.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scala-xml_2.11-1.0.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/scalap-2.11.8.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/shapeless_2.11-2.3.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.25.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.15.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.2.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.3.0-SNAPSHOT-tests.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-core_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-hive_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-repl_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-sql_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-tags_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.11-2.3.0-SNAPSHOT.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spire-macros_2.11-0.13.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/spire_2.11-0.13.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stream-2.7.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.5.9.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xbean-asm5-shaded-4.4.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/xz-1.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.3.2-2.jar -> pyspark-2.3.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-DPark.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-Mockito.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-SnapTree.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-antlr.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-boto.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-f2j.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-javolution.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jbcrypt.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jline.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jpmml-model.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-junit-interface.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-kryo.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-minlog.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-netlib.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-paranamer.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-postgresql.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-protobuf.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-pyrolite.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-reflectasm.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-scala.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-scalacheck.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-scopt.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-slf4j.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-spire.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-xmlenc.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-zstd-jni.txt -> pyspark-2.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-zstd.txt -> pyspark-2.3.0.dev0/deps/licenses
copying lib/py4j-0.10.6-src.zip -> pyspark-2.3.0.dev0/lib
copying lib/pyspark.zip -> pyspark-2.3.0.dev0/lib
copying pyspark/__init__.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/context.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/files.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/join.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/status.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/tests.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/util.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/version.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-2.3.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-2.3.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/tests.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-2.3.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-2.3.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-2.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-2.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-2.3.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tests.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-2.3.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-2.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-2.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-2.3.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-2.3.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/tests.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-2.3.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/flume.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/kafka.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/tests.py -> pyspark-2.3.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-2.3.0.dev0/pyspark/streaming
Writing pyspark-2.3.0.dev0/setup.cfg
Creating tar archive
removing 'pyspark-2.3.0.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/python
Collecting py4j==0.10.6 (from pyspark==2.3.0.dev0)
  Downloading py4j-0.10.6-py2.py3-none-any.whl (189kB)
Installing collected packages: py4j, pyspark
  Running setup.py develop for pyspark
Successfully installed py4j-0.10.6 pyspark
Run basic sanity check on pip-installed version with spark-submit
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit$.initializeLogIfNecessary(SparkSubmit.scala:70)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:127)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Successfully ran pip sanity check
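The check above submits a small PySpark job through spark-submit against the freshly pip-installed package; as the "Successfully ran" line shows, the log4j FileNotFoundException is only a logging-configuration warning and does not fail the run. A minimal sketch of such a smoke test follows — the exact script is not shown in this log, and the file name pip_smoke_test.py and its contents are illustrative only:

    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        # Start a session against the pip-installed PySpark distribution.
        spark = SparkSession.builder.appName("PipSanityCheck").getOrCreate()
        # A trivial distributed job: if the Py4J gateway and the bundled JARs
        # are wired up correctly, this count succeeds.
        assert spark.sparkContext.parallelize(range(100)).count() == 100
        spark.stop()

Such a script would be launched as "spark-submit pip_smoke_test.py" (hypothetical invocation) so that the check exercises the same entry point end users rely on.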
Run basic sanity check with import-based usage
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit$.initializeLogIfNecessary(SparkSubmit.scala:70)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:127)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
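The second check exercises the same installation without spark-submit, by importing pyspark directly in a plain interpreter from the virtualenv. A rough sketch of what an import-based check can look like (again, not the actual script used by this build):

    import pyspark  # verifies the package resolves from the virtualenv
    from pyspark.sql import SparkSession

    # Creating a session from a bare interpreter confirms that the pip-installed
    # package can locate its bundled Spark JARs and start a local JVM.
    spark = SparkSession.builder.master("local[2]").appName("ImportCheck").getOrCreate()
    assert spark.sparkContext.parallelize([1, 2, 3, 4]).sum() == 10
    spark.stop()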
Run the tests for context.py
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit$.initializeLogIfNecessary(SparkSubmit.scala:70)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:127)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit$.initializeLogIfNecessary(SparkSubmit.scala:70)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:127)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
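The hint above can be applied programmatically; for example, from PySpark (setLogLevel in SparkR behaves the same way):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    # Raise the threshold so only errors from the JVM side reach the console;
    # valid levels include ALL, DEBUG, INFO, WARN, ERROR, FATAL and OFF.
    spark.sparkContext.setLogLevel("ERROR")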

[Stage 0:>                                                          (0 + 4) / 4]
[Stage 10:>                                                         (0 + 4) / 4]
[Stage 11:>                                                         (0 + 0) / 2]
Cleaning up temporary directory - /tmp/tmp.9a42384Fo4

========================================================================
Running SparkR tests
========================================================================
Loading required package: methods

Attaching package: 'SparkR'

The following objects are masked from 'package:testthat':

    describe, not

The following objects are masked from 'package:stats':

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

    as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
    sample, subset, summary, transform, union

Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
basic tests for CRAN: .............

DONE ===========================================================================
SerDe functionality: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
...................
Windows-specific tests: S
functions on binary files: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
....
binary functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
...........
broadcast variables: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
..
functions in client.R: .....
test functions in sparkR.R: ..............................................
include R packages: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6

JVM API: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
..
MLlib classification algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
......................................................................
MLlib clustering algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
.....................................................................
MLlib frequent pattern mining: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
.....
MLlib recommendation algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
........
MLlib regression algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
................................................................................................................................
MLlib statistics algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
........
MLlib tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
..............................................................................................
parallelize() and collect(): Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
.............................
basic RDD functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
............................................................................................................................................................................................................................................................................................................................................................................................................................................
partitionBy, groupByKey, reduceByKey etc.: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
....................
functions in sparkR.R: ....
SparkSQL functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
Structured Streaming: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
.................................
tests RDD function take(): Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
................
the textFile() function: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
.............
functions in utils.R: Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
.............................................

Skipped ------------------------------------------------------------------------
1. sparkJars tag in SparkContext (@test_Windows.R#22) - This test is only for Windows, skipped

DONE ===========================================================================
Using R_SCRIPT_PATH = /usr/bin
++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/install-dev.sh
+++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
++ LIB_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib
++ mkdir -p /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib
++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/create-rd.sh
+++ set -o pipefail
+++ set -e
+++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/create-rd.sh
++++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
++++ pwd
+++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
+++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R
+++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/find-r.sh
++++ '[' -z /usr/bin ']'
+++ /usr/bin/Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
Updating SparkR documentation
Loading SparkR
Loading required package: methods
Creating a new generic function for 'as.data.frame' in package 'SparkR'
Creating a new generic function for 'colnames' in package 'SparkR'
Creating a new generic function for 'colnames<-' in package 'SparkR'
Creating a new generic function for 'cov' in package 'SparkR'
Creating a new generic function for 'drop' in package 'SparkR'
Creating a new generic function for 'na.omit' in package 'SparkR'
Creating a new generic function for 'filter' in package 'SparkR'
Creating a new generic function for 'intersect' in package 'SparkR'
Creating a new generic function for 'sample' in package 'SparkR'
Creating a new generic function for 'transform' in package 'SparkR'
Creating a new generic function for 'subset' in package 'SparkR'
Creating a new generic function for 'summary' in package 'SparkR'
Creating a new generic function for 'union' in package 'SparkR'
Creating a new generic function for 'lag' in package 'SparkR'
Creating a new generic function for 'rank' in package 'SparkR'
Creating a new generic function for 'sd' in package 'SparkR'
Creating a new generic function for 'var' in package 'SparkR'
Creating a new generic function for 'window' in package 'SparkR'
Creating a new generic function for 'predict' in package 'SparkR'
Creating a new generic function for 'rbind' in package 'SparkR'
Creating a generic function for 'lapply' from package 'base' in package 'SparkR'
Creating a generic function for 'Filter' from package 'base' in package 'SparkR'
Creating a generic function for 'alias' from package 'stats' in package 'SparkR'
Creating a generic function for 'substr' from package 'base' in package 'SparkR'
Creating a generic function for '%in%' from package 'base' in package 'SparkR'
Creating a generic function for 'mean' from package 'base' in package 'SparkR'
Creating a generic function for 'unique' from package 'base' in package 'SparkR'
Creating a generic function for 'nrow' from package 'base' in package 'SparkR'
Creating a generic function for 'ncol' from package 'base' in package 'SparkR'
Creating a generic function for 'head' from package 'utils' in package 'SparkR'
Creating a generic function for 'factorial' from package 'base' in package 'SparkR'
Creating a generic function for 'atan2' from package 'base' in package 'SparkR'
Creating a generic function for 'ifelse' from package 'base' in package 'SparkR'
Warning messages:
1: In check_dep_version(pkg, version, compare) :
  Need roxygen2 >= 5.0.0 but loaded version is 4.1.1
2: In check_dep_version(pkg, version, compare) :
  Need roxygen2 >= 5.0.0 but loaded version is 4.1.1
++ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/pkg/
* installing *source* package 'SparkR' ...
** R
** inst
** preparing package for lazy loading
Creating a new generic function for 'as.data.frame' in package 'SparkR'
Creating a new generic function for 'colnames' in package 'SparkR'
Creating a new generic function for 'colnames<-' in package 'SparkR'
Creating a new generic function for 'cov' in package 'SparkR'
Creating a new generic function for 'drop' in package 'SparkR'
Creating a new generic function for 'na.omit' in package 'SparkR'
Creating a new generic function for 'filter' in package 'SparkR'
Creating a new generic function for 'intersect' in package 'SparkR'
Creating a new generic function for 'sample' in package 'SparkR'
Creating a new generic function for 'transform' in package 'SparkR'
Creating a new generic function for 'subset' in package 'SparkR'
Creating a new generic function for 'summary' in package 'SparkR'
Creating a new generic function for 'union' in package 'SparkR'
Creating a new generic function for 'lag' in package 'SparkR'
Creating a new generic function for 'rank' in package 'SparkR'
Creating a new generic function for 'sd' in package 'SparkR'
Creating a new generic function for 'var' in package 'SparkR'
Creating a new generic function for 'window' in package 'SparkR'
Creating a new generic function for 'predict' in package 'SparkR'
Creating a new generic function for 'rbind' in package 'SparkR'
Creating a generic function for 'lapply' from package 'base' in package 'SparkR'
Creating a generic function for 'Filter' from package 'base' in package 'SparkR'
Creating a generic function for 'alias' from package 'stats' in package 'SparkR'
Creating a generic function for 'substr' from package 'base' in package 'SparkR'
Creating a generic function for '%in%' from package 'base' in package 'SparkR'
Creating a generic function for 'mean' from package 'base' in package 'SparkR'
Creating a generic function for 'unique' from package 'base' in package 'SparkR'
Creating a generic function for 'nrow' from package 'base' in package 'SparkR'
Creating a generic function for 'ncol' from package 'base' in package 'SparkR'
Creating a generic function for 'head' from package 'utils' in package 'SparkR'
Creating a generic function for 'factorial' from package 'base' in package 'SparkR'
Creating a generic function for 'atan2' from package 'base' in package 'SparkR'
Creating a generic function for 'ifelse' from package 'base' in package 'SparkR'
** help
*** installing help indices
  converting help for package 'SparkR'
    finding HTML links ... done
    AFTSurvivalRegressionModel-class        html  
    ALSModel-class                          html  
    BisectingKMeansModel-class              html  
    DecisionTreeClassificationModel-class   html  
    DecisionTreeRegressionModel-class       html  
    FPGrowthModel-class                     html  
    GBTClassificationModel-class            html  
    GBTRegressionModel-class                html  
    GaussianMixtureModel-class              html  
    GeneralizedLinearRegressionModel-class
                                            html  
    GroupedData                             html  
    IsotonicRegressionModel-class           html  
    KMeansModel-class                       html  
    KSTest-class                            html  
    LDAModel-class                          html  
    LinearSVCModel-class                    html  
    LogisticRegressionModel-class           html  
    MultilayerPerceptronClassificationModel-class
                                            html  
    NaiveBayesModel-class                   html  
    RandomForestClassificationModel-class   html  
    RandomForestRegressionModel-class       html  
    SparkDataFrame                          html  
    StreamingQuery                          html  
    WindowSpec                              html  
    alias                                   html  
    approxQuantile                          html  
    arrange                                 html  
    as.data.frame                           html  
    attach                                  html  
    avg                                     html  
    awaitTermination                        html  
    between                                 html  
    broadcast                               html  
    cache                                   html  
    cacheTable                              html  
    cancelJobGroup                          html  
    cast                                    html  
    checkpoint                              html  
    clearCache                              html  
    clearJobGroup                           html  
    coalesce                                html  
    collect                                 html  
    coltypes                                html  
    column                                  html  
    column_aggregate_functions              html  
    column_collection_functions             html  
    column_datetime_diff_functions          html  
    column_datetime_functions               html  
    column_math_functions                   html  
    column_misc_functions                   html  
    column_nonaggregate_functions           html  
    column_string_functions                 html  
    column_window_functions                 html  
    columnfunctions                         html  
    columns                                 html  
    corr                                    html  
    count                                   html  
    cov                                     html  
    createDataFrame                         html  
    createExternalTable-deprecated          html  
    createOrReplaceTempView                 html  
    createTable                             html  
    crossJoin                               html  
    crosstab                                html  
    cube                                    html  
    currentDatabase                         html  
    dapply                                  html  
    dapplyCollect                           html  
    describe                                html  
    dim                                     html  
    distinct                                html  
    drop                                    html  
    dropDuplicates                          html  
    dropTempTable-deprecated                html  
    dropTempView                            html  
    dtypes                                  html  
    endsWith                                html  
    eq_null_safe                            html  
    except                                  html  
    explain                                 html  
    filter                                  html  
    first                                   html  
    fitted                                  html  
    freqItems                               html  
    gapply                                  html  
    gapplyCollect                           html  
    getLocalProperty                        html  
    getNumPartitions                        html  
    glm                                     html  
    groupBy                                 html  
    hashCode                                html  
    head                                    html  
    hint                                    html  
    histogram                               html  
    insertInto                              html  
    install.spark                           html  
    intersect                               html  
    isActive                                html  
    isLocal                                 html  
    isStreaming                             html  
    join                                    html  
    last                                    html  
    lastProgress                            html  
    limit                                   html  
    listColumns                             html  
    listDatabases                           html  
    listFunctions                           html  
    listTables                              html  
    localCheckpoint                         html  
    match                                   html  
    merge                                   html  
    mutate                                  html  
    nafunctions                             html  
    ncol                                    html  
    not                                     html  
    nrow                                    html  
    orderBy                                 html  
    otherwise                               html  
    over                                    html  
    partitionBy                             html  
    persist                                 html  
    pivot                                   html  
    predict                                 html  
    print.jobj                              html  
    print.structField                       html  
    print.structType                        html  
    printSchema                             html  
    queryName                               html  
    randomSplit                             html  
    rangeBetween                            html  
    rbind                                   html  
    read.df                                 html  
    read.jdbc                               html  
    read.json                               html  
    read.ml                                 html  
    read.orc                                html  
    read.parquet                            html  
    read.stream                             html  
    read.text                               html  
    recoverPartitions                       html  
    refreshByPath                           html  
    refreshTable                            html  
    registerTempTable-deprecated            html  
    rename                                  html  
    repartition                             html  
    rollup                                  html  
    rowsBetween                             html  
    sample                                  html  
    sampleBy                                html  
    saveAsTable                             html  
    schema                                  html  
    select                                  html  
    selectExpr                              html  
    setCheckpointDir                        html  
    setCurrentDatabase                      html  
    setJobDescription                       html  
    setJobGroup                             html  
    setLocalProperty                        html  
    setLogLevel                             html  
    show                                    html  
    showDF                                  html  
    spark.addFile                           html  
    spark.als                               html  
    spark.bisectingKmeans                   html  
    spark.decisionTree                      html  
    spark.fpGrowth                          html  
    spark.gaussianMixture                   html  
    spark.gbt                               html  
    spark.getSparkFiles                     html  
    spark.getSparkFilesRootDirectory        html  
    spark.glm                               html  
    spark.isoreg                            html  
    spark.kmeans                            html  
    spark.kstest                            html  
    spark.lapply                            html  
    spark.lda                               html  
    spark.logit                             html  
    spark.mlp                               html  
    spark.naiveBayes                        html  
    spark.randomForest                      html  
    spark.survreg                           html  
    spark.svmLinear                         html  
    sparkR.callJMethod                      html  
    sparkR.callJStatic                      html  
    sparkR.conf                             html  
    sparkR.init-deprecated                  html  
    sparkR.newJObject                       html  
    sparkR.session                          html  
    sparkR.session.stop                     html  
    sparkR.uiWebUrl                         html  
    sparkR.version                          html  
    sparkRHive.init-deprecated              html  
    sparkRSQL.init-deprecated               html  
    sql                                     html  
    startsWith                              html  
    status                                  html  
    stopQuery                               html  
    storageLevel                            html  
    str                                     html  
    structField                             html  
    structType                              html  
    subset                                  html  
    substr                                  html  
    summarize                               html  
    summary                                 html  
    tableNames                              html  
    tableToDF                               html  
    tables                                  html  
    take                                    html  
    toJSON                                  html  
    uncacheTable                            html  
    union                                   html  
    unionByName                             html  
    unpersist                               html  
    windowOrderBy                           html  
    windowPartitionBy                       html  
    with                                    html  
    withColumn                              html  
    write.df                                html  
    write.jdbc                              html  
    write.json                              html  
    write.ml                                html  
    write.orc                               html  
    write.parquet                           html  
    write.stream                            html  
    write.text                              html  
** building package indices
** installing vignettes
** testing if installed package can be loaded
* DONE (SparkR)
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib
++ jar cfM /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/lib/sparkr.zip SparkR
++ popd
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/..
++ pwd
+ SPARK_HOME=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/bin/load-spark-env.sh
++ '[' -z /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6 ']'
++ '[' -z '' ']'
++ export SPARK_ENV_LOADED=1
++ SPARK_ENV_LOADED=1
++ export SPARK_CONF_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/conf
++ SPARK_CONF_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/conf
++ '[' -f /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/conf/spark-env.sh ']'
++ '[' -z '' ']'
++ ASSEMBLY_DIR2=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.11
++ ASSEMBLY_DIR1=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.12
++ [[ -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.11 ]]
++ [[ -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.12 ]]
++ '[' -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.11 ']'
++ export SPARK_SCALA_VERSION=2.11
++ SPARK_SCALA_VERSION=2.11
+ '[' -f /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/RELEASE ']'
+ SPARK_JARS_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.11/jars
+ '[' -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/assembly/target/scala-2.11/jars ']'
+ SPARK_HOME=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6
+ /usr/bin/R CMD build /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/pkg
* checking for file '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/pkg/DESCRIPTION' ... OK
* preparing 'SparkR':
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... OK
* checking for LF line-endings in source and make files
* checking for empty or unneeded directories
* building 'SparkR_2.3.0.tar.gz'

+ find pkg/vignettes/. -not -name . -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
++ grep Version /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/pkg/DESCRIPTION
++ awk '{print $NF}'
+ VERSION=2.3.0
+ CRAN_CHECK_OPTIONS=--as-cran
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests'
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests --no-manual --no-vignettes'
+ echo 'Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options'
Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options
+ '[' -n 1 ']'
+ '[' -n 1 ']'
+ /usr/bin/R CMD check --as-cran --no-tests --no-manual --no-vignettes SparkR_2.3.0.tar.gz
* using log directory '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/SparkR.Rcheck'
* using R version 3.1.1 (2014-07-10)
* using platform: x86_64-redhat-linux-gnu (64-bit)
* using session charset: ASCII
* using options '--no-tests --no-vignettes'
* checking for file 'SparkR/DESCRIPTION' ... OK
* checking extension type ... Package
* this is package 'SparkR' version '2.3.0'
* checking CRAN incoming feasibility ... NOTE
Maintainer: 'Shivaram Venkataraman <shivaram@cs.berkeley.edu>'
New submission
Package was archived on CRAN
Unknown, possibly mis-spelled, fields in DESCRIPTION:
  'RoxygenNote'
CRAN repository db overrides:
  X-CRAN-Comment: Archived on 2017-10-22 for policy violation.
* checking package namespace information ... OK
* checking package dependencies ... NOTE
  No repository set, so cyclic dependency check skipped
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package 'SparkR' can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking 'build' directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking installed files from 'inst/doc' ... OK
* checking files in 'vignettes' ... OK
* checking examples ... OK
* checking for unstated dependencies in tests ... OK
* checking tests ... SKIPPED
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes in 'inst/doc' ... OK
* checking running R code from vignettes ... SKIPPED
* checking re-building of vignette outputs ... SKIPPED

NOTE: There were 2 notes.
See
  '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/R/SparkR.Rcheck/00check.log'
for details.

+ popd
Tests passed.
Archiving artifacts
Recording test results
Finished: SUCCESS