Success
Console Output

Skipping 17,821 KB..
d_udf_return_scalar (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_return_timestamp_tz (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_string_in_udf (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_struct_complex (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_struct_type (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_struct_with_empty_partition (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_timestamps (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_timestamps_respect_session_timezone (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_unsupported_types (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_varargs (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_vectorized_udf_wrong_return_type (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'

Skipped tests in pyspark.sql.tests.test_pandas_udf_window with python2.7:
    test_array_type (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_bounded_mixed (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_bounded_simple (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_growing_window (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_invalid_args (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_mixed_sql (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_mixed_sql_and_udf (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_mixed_udf (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_multiple_udfs (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_replace_existing (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_shrinking_window (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_simple (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_sliding_window (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'
    test_without_partitionBy (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped 'PyArrow >= 0.12.1 must be installed; however, your version was 0.8.0.'

Skipped tests in pyspark.streaming.tests.test_kinesis with python2.7:
    test_kinesis_stream (pyspark.streaming.tests.test_kinesis.KinesisStreamTests) ... skipped "Skipping all Kinesis Python tests as environmental variable 'ENABLE_KINESIS_TESTS' was not set."
    test_kinesis_stream_api (pyspark.streaming.tests.test_kinesis.KinesisStreamTests) ... skipped "Skipping all Kinesis Python tests as environmental variable 'ENABLE_KINESIS_TESTS' was not set."

========================================================================
Running PySpark packaging tests
========================================================================
Constructing virtual env for testing
Using conda virtual environments
Testing pip installation with python 3.5
Using /tmp/tmp.LP1X6uWOrw for virtualenv
Fetching package metadata ...........
Solving package specifications: .

Package plan for installation in environment /tmp/tmp.LP1X6uWOrw/3.5:

The following NEW packages will be INSTALLED:

    blas:            1.0-mkl                
    ca-certificates: 2019.1.23-0            
    certifi:         2018.8.24-py35_1       
    intel-openmp:    2019.3-199             
    libedit:         3.1.20181209-hc058e9b_0
    libffi:          3.2.1-hd88cf55_4       
    libgcc-ng:       8.2.0-hdf63c60_1       
    libgfortran-ng:  7.3.0-hdf63c60_0       
    libstdcxx-ng:    8.2.0-hdf63c60_1       
    mkl:             2018.0.3-1             
    mkl_fft:         1.0.6-py35h7dd41cf_0   
    mkl_random:      1.0.1-py35h4414c95_1   
    ncurses:         6.1-he6710b0_1         
    numpy:           1.15.2-py35h1d66e8a_0  
    numpy-base:      1.15.2-py35h81de0dd_0  
    openssl:         1.0.2r-h7b6447c_0      
    pandas:          0.23.4-py35h04863e7_0  
    pip:             10.0.1-py35_0          
    python:          3.5.6-hc3d631a_0       
    python-dateutil: 2.7.3-py35_0           
    pytz:            2019.1-py_0            
    readline:        7.0-h7b6447c_5         
    setuptools:      40.2.0-py35_0          
    six:             1.11.0-py35_1          
    sqlite:          3.28.0-h7b6447c_0      
    tk:              8.6.8-hbc83047_0       
    wheel:           0.31.1-py35_0          
    xz:              5.2.4-h14c3975_4       
    zlib:            1.2.11-h7b6447c_3      

#
# To activate this environment, use:
# > source activate /tmp/tmp.LP1X6uWOrw/3.5
#
# To deactivate an active environment, use:
# > source deactivate
#

Creating pip installable source dist
Could not import pypandoc - required to package PySpark
zip_safe flag not set; analyzing archive contents...
pypandoc.__pycache__.__init__.cpython-35: module references __file__

Installed /home/jenkins/workspace/ubuntuSparkPRB/python/.eggs/pypandoc-1.4-py3.5.egg
running sdist
running egg_info
creating pyspark.egg-info
writing pyspark.egg-info/PKG-INFO
writing top-level names to pyspark.egg-info/top_level.txt
writing requirements to pyspark.egg-info/requires.txt
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.0.0.dev0
creating pyspark-3.0.0.dev0/deps
creating pyspark-3.0.0.dev0/deps/bin
creating pyspark-3.0.0.dev0/deps/data
creating pyspark-3.0.0.dev0/deps/data/graphx
creating pyspark-3.0.0.dev0/deps/data/mllib
creating pyspark-3.0.0.dev0/deps/data/mllib/als
creating pyspark-3.0.0.dev0/deps/data/mllib/images
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.0.0.dev0/deps/data/streaming
creating pyspark-3.0.0.dev0/deps/examples
creating pyspark-3.0.0.dev0/deps/examples/ml
creating pyspark-3.0.0.dev0/deps/examples/mllib
creating pyspark-3.0.0.dev0/deps/examples/sql
creating pyspark-3.0.0.dev0/deps/examples/sql/streaming
creating pyspark-3.0.0.dev0/deps/examples/streaming
creating pyspark-3.0.0.dev0/deps/jars
creating pyspark-3.0.0.dev0/deps/licenses
creating pyspark-3.0.0.dev0/deps/sbin
creating pyspark-3.0.0.dev0/lib
creating pyspark-3.0.0.dev0/pyspark
creating pyspark-3.0.0.dev0/pyspark.egg-info
creating pyspark-3.0.0.dev0/pyspark/ml
creating pyspark-3.0.0.dev0/pyspark/ml/linalg
creating pyspark-3.0.0.dev0/pyspark/ml/param
creating pyspark-3.0.0.dev0/pyspark/mllib
creating pyspark-3.0.0.dev0/pyspark/mllib/linalg
creating pyspark-3.0.0.dev0/pyspark/mllib/stat
creating pyspark-3.0.0.dev0/pyspark/python
creating pyspark-3.0.0.dev0/pyspark/python/pyspark
creating pyspark-3.0.0.dev0/pyspark/sql
creating pyspark-3.0.0.dev0/pyspark/streaming
copying files to pyspark-3.0.0.dev0...
copying MANIFEST.in -> pyspark-3.0.0.dev0
copying README.md -> pyspark-3.0.0.dev0
copying setup.cfg -> pyspark-3.0.0.dev0
copying setup.py -> pyspark-3.0.0.dev0
copying deps/bin/beeline -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.0.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-format-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.11.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/generex-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/janino-3.0.11.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-sslengine-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/lz4-java-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.30.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/py4j-0.10.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pyrolite-4.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.0.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.25.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT-tests.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.0-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying lib/py4j-0.10.8.1-src.zip -> pyspark-3.0.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.0.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.0.0.dev0/pyspark/streaming
Writing pyspark-3.0.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-3.0.0.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-3.0.0.dev0.tar.gz
Collecting py4j==0.10.8.1 (from pyspark==3.0.0.dev0)
  Downloading https://files.pythonhosted.org/packages/04/de/2d314a921ef4c20b283e1de94e0780273678caac901564df06b948e4ba9b/py4j-0.10.8.1-py2.py3-none-any.whl (196kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py install for pyspark: started
    Running setup.py install for pyspark: finished with status 'done'
Successfully installed py4j-0.10.8.1 pyspark-3.0.0.dev0
You are using pip version 10.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Run basic sanity check on pip installed version with spark-submit
19/05/20 11:40:19 WARN Utils: Your hostname, amp-jenkins-staging-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.10.32 instead (on interface eno1)
19/05/20 11:40:19 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/20 11:40:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/20 11:40:21 INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT
19/05/20 11:40:21 INFO SparkContext: Submitted application: PipSanityCheck
19/05/20 11:40:21 INFO SecurityManager: Changing view acls to: jenkins
19/05/20 11:40:21 INFO SecurityManager: Changing modify acls to: jenkins
19/05/20 11:40:21 INFO SecurityManager: Changing view acls groups to: 
19/05/20 11:40:21 INFO SecurityManager: Changing modify acls groups to: 
19/05/20 11:40:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/05/20 11:40:21 INFO Utils: Successfully started service 'sparkDriver' on port 42998.
19/05/20 11:40:21 INFO SparkEnv: Registering MapOutputTracker
19/05/20 11:40:21 INFO SparkEnv: Registering BlockManagerMaster
19/05/20 11:40:21 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/20 11:40:21 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/20 11:40:22 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-07b796af-13e6-4867-8961-fe7db9b0daea
19/05/20 11:40:22 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
19/05/20 11:40:22 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/20 11:40:22 INFO log: Logging initialized @3278ms to org.eclipse.jetty.util.log.Slf4jLog
19/05/20 11:40:22 INFO Server: jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_191-b12
19/05/20 11:40:22 INFO Server: Started @3367ms
19/05/20 11:40:22 INFO AbstractConnector: Started ServerConnector@ec56546{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/05/20 11:40:22 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6e3b2edd{/jobs,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@dd479af{/jobs/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1c7fa20c{/jobs/job,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@48aef309{/jobs/job/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5a40cc23{/stages,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@708b65{/stages/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7738b2f1{/stages/stage,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@37dae413{/stages/stage/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4fcb1257{/stages/pool,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@11a32e77{/stages/pool/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7f797c89{/storage,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4b157b9f{/storage/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2b4c7e4c{/storage/rdd,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@61aad903{/storage/rdd/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@18a8065e{/environment,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@32fad768{/environment/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4fa305e0{/executors,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@55f7dfd2{/executors/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1e8ca24c{/executors/threadDump,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5465edd0{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5f33d671{/static,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4d850e73{/,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@660795a7{/api,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3167b945{/jobs/job/kill,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2a892e49{/stages/stage/kill,null,AVAILABLE,@Spark}
19/05/20 11:40:22 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.32:4040
19/05/20 11:40:22 INFO Executor: Starting executor ID driver on host localhost
19/05/20 11:40:22 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44253.
19/05/20 11:40:22 INFO NettyBlockTransferService: Server created on 192.168.10.32:44253
19/05/20 11:40:22 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/20 11:40:22 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.32, 44253, None)
19/05/20 11:40:22 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.32:44253 with 366.3 MiB RAM, BlockManagerId(driver, 192.168.10.32, 44253, None)
19/05/20 11:40:22 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.32, 44253, None)
19/05/20 11:40:22 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.32, 44253, None)
19/05/20 11:40:22 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1f89757d{/metrics/json,null,AVAILABLE,@Spark}
19/05/20 11:40:23 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
19/05/20 11:40:23 INFO SharedState: Warehouse path is 'file:/spark-warehouse'.
19/05/20 11:40:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@632b5984{/SQL,null,AVAILABLE,@Spark}
19/05/20 11:40:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@52949d5c{/SQL/json,null,AVAILABLE,@Spark}
19/05/20 11:40:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@a4cc97d{/SQL/execution,null,AVAILABLE,@Spark}
19/05/20 11:40:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3e41d4fe{/SQL/execution/json,null,AVAILABLE,@Spark}
19/05/20 11:40:23 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5163236{/static/sql,null,AVAILABLE,@Spark}
19/05/20 11:40:23 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/05/20 11:40:24 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31
19/05/20 11:40:24 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31) with 10 output partitions
19/05/20 11:40:24 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31)
19/05/20 11:40:24 INFO DAGScheduler: Parents of final stage: List()
19/05/20 11:40:24 INFO DAGScheduler: Missing parents: List()
19/05/20 11:40:24 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31), which has no missing parents
19/05/20 11:40:24 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 5.9 KiB, free 366.3 MiB)
19/05/20 11:40:24 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 366.3 MiB)
19/05/20 11:40:24 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.32:44253 (size: 3.8 KiB, free: 366.3 MiB)
19/05/20 11:40:24 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1180
19/05/20 11:40:24 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
19/05/20 11:40:24 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
19/05/20 11:40:24 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7333 bytes)
19/05/20 11:40:24 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/05/20 11:40:24 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
19/05/20 11:40:24 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
19/05/20 11:40:24 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
19/05/20 11:40:24 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
19/05/20 11:40:24 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
19/05/20 11:40:24 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
19/05/20 11:40:24 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
19/05/20 11:40:24 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
19/05/20 11:40:24 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
19/05/20 11:40:25 INFO PythonRunner: Times: total = 527, boot = 476, init = 51, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 527, boot = 484, init = 43, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 527, boot = 471, init = 56, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 527, boot = 480, init = 47, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 531, boot = 488, init = 43, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 532, boot = 493, init = 38, finish = 1
19/05/20 11:40:25 INFO PythonRunner: Times: total = 540, boot = 497, init = 42, finish = 1
19/05/20 11:40:25 INFO PythonRunner: Times: total = 543, boot = 505, init = 38, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 543, boot = 501, init = 42, finish = 0
19/05/20 11:40:25 INFO PythonRunner: Times: total = 552, boot = 509, init = 42, finish = 1
19/05/20 11:40:25 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1383 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1383 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1383 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1384 bytes result sent to driver
19/05/20 11:40:25 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 1291 ms on localhost (executor driver) (1/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 1296 ms on localhost (executor driver) (2/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 1302 ms on localhost (executor driver) (3/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 1299 ms on localhost (executor driver) (4/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1364 ms on localhost (executor driver) (5/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1306 ms on localhost (executor driver) (6/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 1302 ms on localhost (executor driver) (7/10)
19/05/20 11:40:25 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 44813
19/05/20 11:40:25 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 1302 ms on localhost (executor driver) (8/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 1306 ms on localhost (executor driver) (9/10)
19/05/20 11:40:25 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 1308 ms on localhost (executor driver) (10/10)
19/05/20 11:40:25 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/05/20 11:40:25 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31) finished in 1.572 s
19/05/20 11:40:25 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
19/05/20 11:40:25 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
19/05/20 11:40:25 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/ubuntuSparkPRB/dev/pip-sanity-check.py:31, took 1.646611 s
Successfully ran pip sanity check
19/05/20 11:40:25 INFO AbstractConnector: Stopped Spark@ec56546{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/05/20 11:40:25 INFO SparkUI: Stopped Spark web UI at http://192.168.10.32:4040
19/05/20 11:40:25 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/20 11:40:25 INFO MemoryStore: MemoryStore cleared
19/05/20 11:40:25 INFO BlockManager: BlockManager stopped
19/05/20 11:40:25 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/20 11:40:25 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/20 11:40:25 INFO SparkContext: Successfully stopped SparkContext
19/05/20 11:40:26 INFO ShutdownHookManager: Shutdown hook called
19/05/20 11:40:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-b9fc98d9-3f04-476c-8395-991fca9609f7/pyspark-9b78b3ba-5494-4292-a4e2-7e771915d8cb
19/05/20 11:40:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-4e509a17-9bd8-4592-b1fb-1a95faf21df2
19/05/20 11:40:26 INFO ShutdownHookManager: Deleting directory /tmp/spark-b9fc98d9-3f04-476c-8395-991fca9609f7
Run basic sanity check with import based
19/05/20 11:40:28 WARN Utils: Your hostname, amp-jenkins-staging-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.10.32 instead (on interface eno1)
19/05/20 11:40:28 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/20 11:40:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
19/05/20 11:40:36 WARN Utils: Your hostname, amp-jenkins-staging-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.10.32 instead (on interface eno1)
19/05/20 11:40:36 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/20 11:40:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/05/20 11:40:41 WARN Utils: Your hostname, amp-jenkins-staging-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.10.32 instead (on interface eno1)
19/05/20 11:40:41 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/20 11:40:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
19/05/20 11:40:58 WARN PythonRunner: Incomplete task 3.0 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
19/05/20 11:40:58 WARN PythonRunner: Incomplete task 1.0 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
19/05/20 11:40:58 WARN PythonRunner: Incomplete task 2.0 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
19/05/20 11:40:58 WARN PythonRunner: Incomplete task 0.0 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
19/05/20 11:40:58 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40, localhost, executor driver): TaskKilled (Stage cancelled)
19/05/20 11:40:58 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41, localhost, executor driver): TaskKilled (Stage cancelled)
19/05/20 11:40:58 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42, localhost, executor driver): TaskKilled (Stage cancelled)
19/05/20 11:40:58 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39, localhost, executor driver): TaskKilled (Stage cancelled)

Testing pip installation with python 3.5
Using /tmp/tmp.LP1X6uWOrw for virtualenv
Fetching package metadata ...........
Solving package specifications: .

Package plan for installation in environment /tmp/tmp.LP1X6uWOrw/3.5:

The following NEW packages will be INSTALLED:

    blas:            1.0-mkl                
    ca-certificates: 2019.1.23-0            
    certifi:         2018.8.24-py35_1       
    intel-openmp:    2019.3-199             
    libedit:         3.1.20181209-hc058e9b_0
    libffi:          3.2.1-hd88cf55_4       
    libgcc-ng:       8.2.0-hdf63c60_1       
    libgfortran-ng:  7.3.0-hdf63c60_0       
    libstdcxx-ng:    8.2.0-hdf63c60_1       
    mkl:             2018.0.3-1             
    mkl_fft:         1.0.6-py35h7dd41cf_0   
    mkl_random:      1.0.1-py35h4414c95_1   
    ncurses:         6.1-he6710b0_1         
    numpy:           1.15.2-py35h1d66e8a_0  
    numpy-base:      1.15.2-py35h81de0dd_0  
    openssl:         1.0.2r-h7b6447c_0      
    pandas:          0.23.4-py35h04863e7_0  
    pip:             10.0.1-py35_0          
    python:          3.5.6-hc3d631a_0       
    python-dateutil: 2.7.3-py35_0           
    pytz:            2019.1-py_0            
    readline:        7.0-h7b6447c_5         
    setuptools:      40.2.0-py35_0          
    six:             1.11.0-py35_1          
    sqlite:          3.28.0-h7b6447c_0      
    tk:              8.6.8-hbc83047_0       
    wheel:           0.31.1-py35_0          
    xz:              5.2.4-h14c3975_4       
    zlib:            1.2.11-h7b6447c_3      

#
# To activate this environment, use:
# > source activate /tmp/tmp.LP1X6uWOrw/3.5
#
# To deactivate an active environment, use:
# > source deactivate
#

Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing top-level names to pyspark.egg-info/top_level.txt
writing requirements to pyspark.egg-info/requires.txt
writing pyspark.egg-info/PKG-INFO
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
Could not import pypandoc - required to package PySpark
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.0.0.dev0
creating pyspark-3.0.0.dev0/deps
creating pyspark-3.0.0.dev0/deps/bin
creating pyspark-3.0.0.dev0/deps/data
creating pyspark-3.0.0.dev0/deps/data/graphx
creating pyspark-3.0.0.dev0/deps/data/mllib
creating pyspark-3.0.0.dev0/deps/data/mllib/als
creating pyspark-3.0.0.dev0/deps/data/mllib/images
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.0.0.dev0/deps/data/streaming
creating pyspark-3.0.0.dev0/deps/examples
creating pyspark-3.0.0.dev0/deps/examples/ml
creating pyspark-3.0.0.dev0/deps/examples/mllib
creating pyspark-3.0.0.dev0/deps/examples/sql
creating pyspark-3.0.0.dev0/deps/examples/sql/streaming
creating pyspark-3.0.0.dev0/deps/examples/streaming
creating pyspark-3.0.0.dev0/deps/jars
creating pyspark-3.0.0.dev0/deps/licenses
creating pyspark-3.0.0.dev0/deps/sbin
creating pyspark-3.0.0.dev0/lib
creating pyspark-3.0.0.dev0/pyspark
creating pyspark-3.0.0.dev0/pyspark.egg-info
creating pyspark-3.0.0.dev0/pyspark/ml
creating pyspark-3.0.0.dev0/pyspark/ml/linalg
creating pyspark-3.0.0.dev0/pyspark/ml/param
creating pyspark-3.0.0.dev0/pyspark/mllib
creating pyspark-3.0.0.dev0/pyspark/mllib/linalg
creating pyspark-3.0.0.dev0/pyspark/mllib/stat
creating pyspark-3.0.0.dev0/pyspark/python
creating pyspark-3.0.0.dev0/pyspark/python/pyspark
creating pyspark-3.0.0.dev0/pyspark/sql
creating pyspark-3.0.0.dev0/pyspark/streaming
copying files to pyspark-3.0.0.dev0...
copying MANIFEST.in -> pyspark-3.0.0.dev0
copying README.md -> pyspark-3.0.0.dev0
copying setup.cfg -> pyspark-3.0.0.dev0
copying setup.py -> pyspark-3.0.0.dev0
copying deps/bin/beeline -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.0.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-format-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.11.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/generex-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/janino-3.0.11.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-sslengine-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.5.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/lz4-java-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.30.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/py4j-0.10.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pyrolite-4.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.0.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.25.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT-tests.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.0-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying lib/py4j-0.10.8.1-src.zip -> pyspark-3.0.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.0.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.0.0.dev0/pyspark/streaming
Writing pyspark-3.0.0.dev0/setup.cfg
Creating tar archive
removing 'pyspark-3.0.0.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/ubuntuSparkPRB/python
Collecting py4j==0.10.8.1 (from pyspark==3.0.0.dev0)
  Downloading https://files.pythonhosted.org/packages/04/de/2d314a921ef4c20b283e1de94e0780273678caac901564df06b948e4ba9b/py4j-0.10.8.1-py2.py3-none-any.whl (196kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py develop for pyspark
Successfully installed py4j-0.10.8.1 pyspark
You are using pip version 10.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Run basic sanity check on pip installed version with spark-submit
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:73)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:81)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:949)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:958)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Successfully ran pip sanity check
Run basic sanity check with import based
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:73)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:81)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:949)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:958)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:73)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:81)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:949)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:958)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:73)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:81)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:949)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:958)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
Cleaning up temporary directory - /tmp/tmp.LP1X6uWOrw

========================================================================
Running SparkR tests
========================================================================
During startup - Warning message:
In .First() :
  Support for R prior to version 3.4 is deprecated since Spark 3.0.0
Loading required package: methods

Attaching package: ‘SparkR’

The following objects are masked from ‘package:testthat’:

    describe, not

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
    sample, subset, summary, transform, union

Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
basic tests for CRAN: .............

DONE ===========================================================================
functions on binary files: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
....
binary functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
...........
broadcast variables: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
..
functions in client.R: .....
test functions in sparkR.R: ..........................................
include R packages: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB

JVM API: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
..
MLlib classification algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
......................................................................
MLlib clustering algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
......................................................................
MLlib frequent pattern mining: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
......
MLlib recommendation algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
........
MLlib regression algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
................................................................................................................................
MLlib statistics algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
........
MLlib tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
..............................................................................................
parallelize() and collect(): Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
.............................
basic RDD functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
............................................................................................................................................................................................................................................................................................................................................................................................................................................
SerDe functionality: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
.......................................
partitionBy, groupByKey, reduceByKey etc.: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
....................
functions in sparkR.R: ....
SparkSQL Arrow optimization: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
SSSSSSSSSS
test show SparkDataFrame when eager execution is enabled.: ......
SparkSQL functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
Structured Streaming: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
..........................................
tests RDD function take(): Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
................
the textFile() function: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
.............
functions in utils.R: Spark package found in SPARK_HOME: /home/jenkins/workspace/ubuntuSparkPRB
.............................................
Windows-specific tests: S

Skipped ------------------------------------------------------------------------
1. createDataFrame/collect Arrow optimization (@test_sparkSQL_arrow.R#25) - arrow not installed

2. createDataFrame/collect Arrow optimization - many partitions (partition order test) (@test_sparkSQL_arrow.R#48) - arrow not installed

3. createDataFrame/collect Arrow optimization - type specification (@test_sparkSQL_arrow.R#64) - arrow not installed

4. dapply() Arrow optimization (@test_sparkSQL_arrow.R#94) - arrow not installed

5. dapply() Arrow optimization - type specification (@test_sparkSQL_arrow.R#134) - arrow not installed

6. dapply() Arrow optimization - type specification (date and timestamp) (@test_sparkSQL_arrow.R#169) - arrow not installed

7. gapply() Arrow optimization (@test_sparkSQL_arrow.R#188) - arrow not installed

8. gapply() Arrow optimization - type specification (@test_sparkSQL_arrow.R#237) - arrow not installed

9. gapply() Arrow optimization - type specification (date and timestamp) (@test_sparkSQL_arrow.R#276) - arrow not installed

10. Arrow optimization - unsupported types (@test_sparkSQL_arrow.R#297) - arrow not installed

11. sparkJars tag in SparkContext (@test_Windows.R#22) - This test is only for Windows, skipped

DONE ===========================================================================
Using R_SCRIPT_PATH = /usr/bin
++++ dirname /home/jenkins/workspace/ubuntuSparkPRB/R/install-dev.sh
+++ cd /home/jenkins/workspace/ubuntuSparkPRB/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/ubuntuSparkPRB/R
++ LIB_DIR=/home/jenkins/workspace/ubuntuSparkPRB/R/lib
++ mkdir -p /home/jenkins/workspace/ubuntuSparkPRB/R/lib
++ pushd /home/jenkins/workspace/ubuntuSparkPRB/R
++ . /home/jenkins/workspace/ubuntuSparkPRB/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ . /home/jenkins/workspace/ubuntuSparkPRB/R/create-rd.sh
+++ set -o pipefail
+++ set -e
+++++ dirname /home/jenkins/workspace/ubuntuSparkPRB/R/create-rd.sh
++++ cd /home/jenkins/workspace/ubuntuSparkPRB/R
++++ pwd
+++ FWDIR=/home/jenkins/workspace/ubuntuSparkPRB/R
+++ pushd /home/jenkins/workspace/ubuntuSparkPRB/R
+++ . /home/jenkins/workspace/ubuntuSparkPRB/R/find-r.sh
++++ '[' -z /usr/bin ']'
+++ /usr/bin/Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
Updating SparkR documentation
Loading SparkR
Loading required package: methods
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘alias’ from package ‘stats’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘mean’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘unique’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘head’ from package ‘utils’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
++ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/ubuntuSparkPRB/R/lib /home/jenkins/workspace/ubuntuSparkPRB/R/pkg/
* installing *source* package ‘SparkR’ ...
** R
** inst
** preparing package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘alias’ from package ‘stats’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘mean’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘unique’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘head’ from package ‘utils’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded
* DONE (SparkR)
++ cd /home/jenkins/workspace/ubuntuSparkPRB/R/lib
++ jar cfM /home/jenkins/workspace/ubuntuSparkPRB/R/lib/sparkr.zip SparkR
++ popd
++ cd /home/jenkins/workspace/ubuntuSparkPRB/R/..
++ pwd
+ SPARK_HOME=/home/jenkins/workspace/ubuntuSparkPRB
+ . /home/jenkins/workspace/ubuntuSparkPRB/bin/load-spark-env.sh
++ '[' -z /home/jenkins/workspace/ubuntuSparkPRB ']'
++ SPARK_ENV_SH=spark-env.sh
++ '[' -z '' ']'
++ export SPARK_ENV_LOADED=1
++ SPARK_ENV_LOADED=1
++ export SPARK_CONF_DIR=/home/jenkins/workspace/ubuntuSparkPRB/conf
++ SPARK_CONF_DIR=/home/jenkins/workspace/ubuntuSparkPRB/conf
++ SPARK_ENV_SH=/home/jenkins/workspace/ubuntuSparkPRB/conf/spark-env.sh
++ [[ -f /home/jenkins/workspace/ubuntuSparkPRB/conf/spark-env.sh ]]
++ export SPARK_SCALA_VERSION=2.12
++ SPARK_SCALA_VERSION=2.12
+ '[' -f /home/jenkins/workspace/ubuntuSparkPRB/RELEASE ']'
+ SPARK_JARS_DIR=/home/jenkins/workspace/ubuntuSparkPRB/assembly/target/scala-2.12/jars
+ '[' -d /home/jenkins/workspace/ubuntuSparkPRB/assembly/target/scala-2.12/jars ']'
+ SPARK_HOME=/home/jenkins/workspace/ubuntuSparkPRB
+ /usr/bin/R CMD build /home/jenkins/workspace/ubuntuSparkPRB/R/pkg
* checking for file ‘/home/jenkins/workspace/ubuntuSparkPRB/R/pkg/DESCRIPTION’ ... OK
* preparing ‘SparkR’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... OK
* checking for LF line-endings in source and make files
* checking for empty or unneeded directories
* building ‘SparkR_3.0.0.tar.gz’

+ find pkg/vignettes/. -not -name . -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
++ grep Version /home/jenkins/workspace/ubuntuSparkPRB/R/pkg/DESCRIPTION
++ awk '{print $NF}'
+ VERSION=3.0.0
+ CRAN_CHECK_OPTIONS=--as-cran
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests'
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests --no-manual --no-vignettes'
+ echo 'Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options'
Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options
+ '[' -n 1 ']'
+ '[' -n 1 ']'
+ /usr/bin/R CMD check --as-cran --no-tests --no-manual --no-vignettes SparkR_3.0.0.tar.gz
* using log directory ‘/home/jenkins/workspace/ubuntuSparkPRB/R/SparkR.Rcheck’
* using R version 3.2.3 (2015-12-10)
* using platform: x86_64-pc-linux-gnu (64-bit)
* using session charset: UTF-8
* using options ‘--no-tests --no-manual --no-vignettes --as-cran’
* checking for file ‘SparkR/DESCRIPTION’ ... OK
* checking extension type ... Package
* this is package ‘SparkR’ version ‘3.0.0’
* checking CRAN incoming feasibility ... Note_to_CRAN_maintainers
Maintainer: ‘Shivaram Venkataraman <shivaram@cs.berkeley.edu>’
* checking package namespace information ... OK
* checking package dependencies ... NOTE
  No repository set, so cyclic dependency check skipped
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘SparkR’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking installed files from ‘inst/doc’ ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ... SKIPPED
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes in ‘inst/doc’ ... OK
* checking running R code from vignettes ... SKIPPED
* checking re-building of vignette outputs ... SKIPPED
* DONE

Status: 1 NOTE
See
  ‘/home/jenkins/workspace/ubuntuSparkPRB/R/SparkR.Rcheck/00check.log’
for details.


+ popd
Tests passed.
+ ./build/sbt unsafe/test
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/ubuntuSparkPRB/139/
Test PASSed.
Finished: SUCCESS