Failed
Console Output

Skipping 15,067 KB..
st be installed; however, your version was 0.19.2.'
    test_pandas_udf_decorator (pyspark.sql.tests.test_pandas_udf.PandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_pandas_udf_detect_unsafe_type_conversion (pyspark.sql.tests.test_pandas_udf.PandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_stopiteration_in_udf (pyspark.sql.tests.test_pandas_udf.PandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_udf_wrong_arg (pyspark.sql.tests.test_pandas_udf.PandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'

Skipped tests in pyspark.sql.tests.test_pandas_udf_grouped_agg with python2.7:
    test_alias (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_array_type (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_basic (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_complex_expressions (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_complex_groupby (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_grouped_with_empty_partition (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_invalid_args (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_manual (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_sql (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_udfs (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_multiple_udfs (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_register_vectorized_udf_basic (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_retain_group_columns (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_unsupported_types (pyspark.sql.tests.test_pandas_udf_grouped_agg.GroupedAggPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'

Skipped tests in pyspark.sql.tests.test_pandas_udf_grouped_map with python2.7:
    test_array_type_correct (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_coerce (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_column_order (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_complex_groupby (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_datatype_string (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_decorator (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_empty_groupby (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_grouped_with_empty_partition (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_scalar_udfs_followed_by_grouby_apply (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_positional_assignment_conf (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_register_grouped_map_udf (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_self_join_with_pandas (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_supported_types (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_timestamp_dst (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_udf_with_key (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_unsupported_types (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_wrong_args (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_wrong_return_type (pyspark.sql.tests.test_pandas_udf_grouped_map.GroupedMapPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'

Skipped tests in pyspark.sql.tests.test_pandas_udf_scalar with python2.7:
    test_datasource_with_udf (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_udf (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_udf_and_sql (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_nondeterministic_vectorized_udf (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_nondeterministic_vectorized_udf_in_aggregate (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_pandas_udf_nested_arrays (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_pandas_udf_tokenize (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_register_nondeterministic_vectorized_udf_basic (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_register_vectorized_udf_basic (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_scalar_iter_udf_close (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_scalar_iter_udf_close_early (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_scalar_iter_udf_init (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_timestamp_dst (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_type_annotation (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_array_type (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_basic (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_chained (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_chained_struct_type (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_check_config (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_complex (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_datatype_string (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_dates (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_decorator (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_empty_partition (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_exception (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_invalid_length (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_nested_struct (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_array (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_binary (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_boolean (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_byte (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_decimal (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_double (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_float (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_int (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_long (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_short (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_null_string (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_return_scalar (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_return_timestamp_tz (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_string_in_udf (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_struct_complex (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_struct_type (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_struct_with_empty_partition (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_timestamps (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_timestamps_respect_session_timezone (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_unsupported_types (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_varargs (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_vectorized_udf_wrong_return_type (pyspark.sql.tests.test_pandas_udf_scalar.ScalarPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'

Skipped tests in pyspark.sql.tests.test_pandas_udf_window with python2.7:
    test_array_type (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_bounded_mixed (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_bounded_simple (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_growing_window (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_invalid_args (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_sql (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_sql_and_udf (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_mixed_udf (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_multiple_udfs (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_replace_existing (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_shrinking_window (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_simple (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_sliding_window (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'
    test_without_partitionBy (pyspark.sql.tests.test_pandas_udf_window.WindowPandasUDFTests) ... skipped u'Pandas >= 0.23.2 must be installed; however, your version was 0.19.2.'

Skipped tests in pyspark.sql.tests.test_dataframe with python3.6:
      test_create_dataframe_required_pandas_not_found (pyspark.sql.tests.test_dataframe.DataFrameTests) ... SKIP (0.000s)
      test_to_pandas_required_pandas_not_found (pyspark.sql.tests.test_dataframe.DataFrameTests) ... SKIP (0.000s)
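The mass of skips above comes from a minimum-version gate on the pandas visible to the python2.7 test workers (0.19.2 installed, 0.23.2 required). A minimal sketch of that kind of gate with plain unittest is shown below; it is an illustrative assumption, not necessarily the exact helper PySpark uses, and the 0.23.2 threshold is copied from the skip messages.

    # hypothetical sketch: skip pandas-dependent tests when pandas is missing or too old
    import unittest

    _pandas_requirement_message = None
    try:
        import pandas
        from distutils.version import LooseVersion
        if LooseVersion(pandas.__version__) < LooseVersion("0.23.2"):
            _pandas_requirement_message = (
                "Pandas >= 0.23.2 must be installed; however, "
                "your version was %s." % pandas.__version__)
    except ImportError as error:
        _pandas_requirement_message = str(error)

    @unittest.skipIf(_pandas_requirement_message is not None, _pandas_requirement_message)
    class PandasUDFTests(unittest.TestCase):
        def test_basic(self):
            # placeholder body; the real tests exercise pandas_udf
            self.assertTrue(True)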

========================================================================
Running PySpark packaging tests
========================================================================
Constructing virtual env for testing
Using conda virtual environments
Testing pip installation with python 3.5
Using /tmp/tmp.US6uXYj7JT for virtualenv
Collecting package metadata: ...working... done
Solving environment: ...working... done

## Package Plan ##

  environment location: /tmp/tmp.US6uXYj7JT/3.5

  added / updated specs:
    - numpy
    - pandas
    - pip
    - python=3.5
    - setuptools


The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-64::_libgcc_mutex-0.1-main
  blas               pkgs/main/linux-64::blas-1.0-mkl
  ca-certificates    pkgs/main/linux-64::ca-certificates-2019.5.15-0
  certifi            pkgs/main/linux-64::certifi-2018.8.24-py35_1
  intel-openmp       pkgs/main/linux-64::intel-openmp-2019.4-243
  libedit            pkgs/main/linux-64::libedit-3.1.20181209-hc058e9b_0
  libffi             pkgs/main/linux-64::libffi-3.2.1-hd88cf55_4
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
  libgfortran-ng     pkgs/main/linux-64::libgfortran-ng-7.3.0-hdf63c60_0
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
  mkl                pkgs/main/linux-64::mkl-2018.0.3-1
  mkl_fft            pkgs/main/linux-64::mkl_fft-1.0.6-py35h7dd41cf_0
  mkl_random         pkgs/main/linux-64::mkl_random-1.0.1-py35h4414c95_1
  ncurses            pkgs/main/linux-64::ncurses-6.1-he6710b0_1
  numpy              pkgs/main/linux-64::numpy-1.15.2-py35h1d66e8a_0
  numpy-base         pkgs/main/linux-64::numpy-base-1.15.2-py35h81de0dd_0
  openssl            pkgs/main/linux-64::openssl-1.0.2s-h7b6447c_0
  pandas             pkgs/main/linux-64::pandas-0.23.4-py35h04863e7_0
  pip                pkgs/main/linux-64::pip-10.0.1-py35_0
  python             pkgs/main/linux-64::python-3.5.6-hc3d631a_0
  python-dateutil    pkgs/main/linux-64::python-dateutil-2.7.3-py35_0
  pytz               pkgs/main/noarch::pytz-2019.1-py_0
  readline           pkgs/main/linux-64::readline-7.0-h7b6447c_5
  setuptools         pkgs/main/linux-64::setuptools-40.2.0-py35_0
  six                pkgs/main/linux-64::six-1.11.0-py35_1
  sqlite             pkgs/main/linux-64::sqlite-3.29.0-h7b6447c_0
  tk                 pkgs/main/linux-64::tk-8.6.8-hbc83047_0
  wheel              pkgs/main/linux-64::wheel-0.31.1-py35_0
  xz                 pkgs/main/linux-64::xz-5.2.4-h14c3975_4
  zlib               pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3


Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
#
# To activate this environment, use:
# > conda activate /tmp/tmp.US6uXYj7JT/3.5
#
# To deactivate an active environment, use:
# > conda deactivate
#
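The conda environment created above can be reproduced roughly as follows. The prefix and package list are taken from the transaction log; the subprocess call is an illustrative assumption, not the exact command the packaging script issues.

    # hypothetical sketch: recreate a comparable conda env for the pip-install test
    import subprocess

    prefix = "/tmp/tmp.US6uXYj7JT/3.5"  # temp virtualenv root reported in the log
    subprocess.run(
        ["conda", "create", "--yes", "--prefix", prefix,
         "python=3.5", "numpy", "pandas", "pip", "setuptools"],
        check=True)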

Creating pip installable source dist
Could not import pypandoc - required to package PySpark
zip_safe flag not set; analyzing archive contents...
pypandoc.__pycache__.__init__.cpython-35: module references __file__

Installed /home/jenkins/workspace/SparkPullRequestBuilder/python/.eggs/pypandoc-1.4-py3.5.egg
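The "running sdist" output that follows is the source-distribution build. A rough hand-run equivalent, executed from the root of a Spark checkout, is sketched below; the exact invocation and tarball name used by the packaging script may differ.

    # hypothetical sketch: build the PySpark source distribution and pip-install it
    import subprocess
    import sys

    # build python/dist/pyspark-3.0.0.dev0.tar.gz
    subprocess.run([sys.executable, "setup.py", "sdist"], cwd="python", check=True)
    # install the result into the active environment
    subprocess.run(["pip", "install", "python/dist/pyspark-3.0.0.dev0.tar.gz"], check=True)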
running sdist
running egg_info
creating pyspark.egg-info
writing top-level names to pyspark.egg-info/top_level.txt
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing pyspark.egg-info/PKG-INFO
writing requirements to pyspark.egg-info/requires.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.0.0.dev0
creating pyspark-3.0.0.dev0/deps
creating pyspark-3.0.0.dev0/deps/bin
creating pyspark-3.0.0.dev0/deps/data
creating pyspark-3.0.0.dev0/deps/data/graphx
creating pyspark-3.0.0.dev0/deps/data/mllib
creating pyspark-3.0.0.dev0/deps/data/mllib/als
creating pyspark-3.0.0.dev0/deps/data/mllib/images
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.0.0.dev0/deps/data/streaming
creating pyspark-3.0.0.dev0/deps/examples
creating pyspark-3.0.0.dev0/deps/examples/ml
creating pyspark-3.0.0.dev0/deps/examples/mllib
creating pyspark-3.0.0.dev0/deps/examples/sql
creating pyspark-3.0.0.dev0/deps/examples/sql/streaming
creating pyspark-3.0.0.dev0/deps/examples/streaming
creating pyspark-3.0.0.dev0/deps/jars
creating pyspark-3.0.0.dev0/deps/licenses
creating pyspark-3.0.0.dev0/deps/sbin
creating pyspark-3.0.0.dev0/lib
creating pyspark-3.0.0.dev0/pyspark
creating pyspark-3.0.0.dev0/pyspark.egg-info
creating pyspark-3.0.0.dev0/pyspark/ml
creating pyspark-3.0.0.dev0/pyspark/ml/linalg
creating pyspark-3.0.0.dev0/pyspark/ml/param
creating pyspark-3.0.0.dev0/pyspark/mllib
creating pyspark-3.0.0.dev0/pyspark/mllib/linalg
creating pyspark-3.0.0.dev0/pyspark/mllib/stat
creating pyspark-3.0.0.dev0/pyspark/python
creating pyspark-3.0.0.dev0/pyspark/python/pyspark
creating pyspark-3.0.0.dev0/pyspark/sql
creating pyspark-3.0.0.dev0/pyspark/streaming
copying files to pyspark-3.0.0.dev0...
copying MANIFEST.in -> pyspark-3.0.0.dev0
copying README.md -> pyspark-3.0.0.dev0
copying setup.cfg -> pyspark-3.0.0.dev0
copying setup.py -> pyspark-3.0.0.dev0
copying deps/bin/beeline -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.0.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-format-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aws-java-sdk-1.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/azure-storage-2.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/generex-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-aws-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-azure-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/janino-3.0.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-sslengine-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/lz4-java-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.30.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okapi-shade-0.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/py4j-0.10.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pyrolite-4.30.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.25.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT-tests.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-cypher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph-api_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.14.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.0-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying lib/py4j-0.10.8.1-src.zip -> pyspark-3.0.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.0.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resourceinformation.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.0.0.dev0/pyspark/streaming
Writing pyspark-3.0.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-3.0.0.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-3.0.0.dev0.tar.gz
Collecting py4j==0.10.8.1 (from pyspark==3.0.0.dev0)
  Downloading https://files.pythonhosted.org/packages/04/de/2d314a921ef4c20b283e1de94e0780273678caac901564df06b948e4ba9b/py4j-0.10.8.1-py2.py3-none-any.whl (196kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py install for pyspark: started
    Running setup.py install for pyspark: finished with status 'done'
Successfully installed py4j-0.10.8.1 pyspark-3.0.0.dev0
You are using pip version 10.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
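A quick way to confirm that the package installed above resolves to the expected build is to import it and check the reported version. This is a hypothetical spot check added for illustration, not a step of the logged run; the module and attribute are standard pyspark, and the expected value comes from the pyspark-3.0.0.dev0 artifact installed above.

    # Hypothetical spot check, not part of the logged pipeline.
    import pyspark
    print(pyspark.__version__)  # expected: 3.0.0.dev0 for this build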
Run basic sanity check on pip installed version with spark-submit
19/07/21 23:58:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/07/21 23:58:55 INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT
19/07/21 23:58:55 INFO ResourceUtils: ==============================================================
19/07/21 23:58:55 INFO ResourceUtils: Resources for spark.driver:

19/07/21 23:58:55 INFO ResourceUtils: ==============================================================
19/07/21 23:58:55 INFO SparkContext: Submitted application: PipSanityCheck
19/07/21 23:58:55 INFO SecurityManager: Changing view acls to: jenkins
19/07/21 23:58:55 INFO SecurityManager: Changing modify acls to: jenkins
19/07/21 23:58:55 INFO SecurityManager: Changing view acls groups to: 
19/07/21 23:58:55 INFO SecurityManager: Changing modify acls groups to: 
19/07/21 23:58:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
19/07/21 23:58:55 INFO Utils: Successfully started service 'sparkDriver' on port 42366.
19/07/21 23:58:55 INFO SparkEnv: Registering MapOutputTracker
19/07/21 23:58:55 INFO SparkEnv: Registering BlockManagerMaster
19/07/21 23:58:55 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/07/21 23:58:55 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/07/21 23:58:56 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-27102ab3-684c-4328-966b-cdd269990fdb
19/07/21 23:58:56 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
19/07/21 23:58:56 INFO SparkEnv: Registering OutputCommitCoordinator
19/07/21 23:58:56 INFO log: Logging initialized @4834ms to org.eclipse.jetty.util.log.Slf4jLog
19/07/21 23:58:56 INFO Server: jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_191-b12
19/07/21 23:58:56 INFO Server: Started @4964ms
19/07/21 23:58:56 INFO AbstractConnector: Started ServerConnector@39026e0c{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/07/21 23:58:56 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1611cb6f{/jobs,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3364c137{/jobs/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1f58f4be{/jobs/job,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@27973376{/jobs/job/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@44882f3{/stages,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3444d3f3{/stages/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1df1f62f{/stages/stage,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2deb56d8{/stages/stage/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@737c2a05{/stages/pool,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@77e055dc{/stages/pool/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@75532976{/storage,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5482a04f{/storage/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3748fab0{/storage/rdd,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1d232994{/storage/rdd/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@55cbd4d6{/environment,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1e590f29{/environment/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1b06845{/executors,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5a2a850e{/executors/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6130fb83{/executors/threadDump,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@494e9dd{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@310b047{/static,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@55339c67{/,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@59506791{/api,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@429e7840{/jobs/job/kill,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1d751ab{/stages/stage/kill,null,AVAILABLE,@Spark}
19/07/21 23:58:56 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://amp-jenkins-worker-05.amp:4040
19/07/21 23:58:56 INFO Executor: Starting executor ID driver on host localhost
19/07/21 23:58:56 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42819.
19/07/21 23:58:56 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-05.amp:42819
19/07/21 23:58:56 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/07/21 23:58:56 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-05.amp, 42819, None)
19/07/21 23:58:56 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-05.amp:42819 with 366.3 MiB RAM, BlockManagerId(driver, amp-jenkins-worker-05.amp, 42819, None)
19/07/21 23:58:56 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-05.amp, 42819, None)
19/07/21 23:58:56 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-05.amp, 42819, None)
19/07/21 23:58:57 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@a9b425c{/metrics/json,null,AVAILABLE,@Spark}
19/07/21 23:58:57 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
19/07/21 23:58:57 INFO SharedState: Warehouse path is 'file:/spark-warehouse'.
19/07/21 23:58:57 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@9983d34{/SQL,null,AVAILABLE,@Spark}
19/07/21 23:58:57 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5addf9cd{/SQL/json,null,AVAILABLE,@Spark}
19/07/21 23:58:57 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@93fe036{/SQL/execution,null,AVAILABLE,@Spark}
19/07/21 23:58:57 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@51474e11{/SQL/execution/json,null,AVAILABLE,@Spark}
19/07/21 23:58:57 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7e506547{/static/sql,null,AVAILABLE,@Spark}
19/07/21 23:58:58 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/07/21 23:58:59 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31
19/07/21 23:58:59 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31) with 10 output partitions
19/07/21 23:58:59 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31)
19/07/21 23:58:59 INFO DAGScheduler: Parents of final stage: List()
19/07/21 23:58:59 INFO DAGScheduler: Missing parents: List()
19/07/21 23:58:59 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31), which has no missing parents
19/07/21 23:58:59 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 5.9 KiB, free 366.3 MiB)
19/07/21 23:58:59 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 366.3 MiB)
19/07/21 23:58:59 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on amp-jenkins-worker-05.amp:42819 (size: 3.8 KiB, free: 366.3 MiB)
19/07/21 23:58:59 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1180
19/07/21 23:58:59 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
19/07/21 23:58:59 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
19/07/21 23:58:59 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7333 bytes)
19/07/21 23:58:59 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
19/07/21 23:58:59 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
19/07/21 23:58:59 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
19/07/21 23:58:59 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
19/07/21 23:58:59 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/07/21 23:58:59 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
19/07/21 23:58:59 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
19/07/21 23:58:59 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
19/07/21 23:58:59 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
19/07/21 23:58:59 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1403, boot = 1339, init = 63, finish = 1
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1402, boot = 1358, init = 44, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1408, boot = 1366, init = 42, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1403, boot = 1354, init = 48, finish = 1
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1405, boot = 1362, init = 43, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1404, boot = 1349, init = 55, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1402, boot = 1344, init = 58, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1417, boot = 1374, init = 43, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1414, boot = 1370, init = 44, finish = 0
19/07/21 23:59:01 INFO PythonRunner: Times: total = 1421, boot = 1379, init = 42, finish = 0
19/07/21 23:59:01 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1383 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1383 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1383 bytes result sent to driver
19/07/21 23:59:01 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1384 bytes result sent to driver
19/07/21 23:59:01 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 2155 ms on localhost (executor driver) (1/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 2161 ms on localhost (executor driver) (2/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 2158 ms on localhost (executor driver) (3/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 2156 ms on localhost (executor driver) (4/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 2157 ms on localhost (executor driver) (5/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 2160 ms on localhost (executor driver) (6/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 2165 ms on localhost (executor driver) (7/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 2169 ms on localhost (executor driver) (8/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 2169 ms on localhost (executor driver) (9/10)
19/07/21 23:59:01 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2232 ms on localhost (executor driver) (10/10)
19/07/21 23:59:01 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
19/07/21 23:59:01 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 39375
19/07/21 23:59:01 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31) finished in 2.485 s
19/07/21 23:59:01 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
19/07/21 23:59:01 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
19/07/21 23:59:01 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/SparkPullRequestBuilder/dev/pip-sanity-check.py:31, took 2.553415 s
Successfully ran pip sanity check
19/07/21 23:59:01 INFO AbstractConnector: Stopped Spark@39026e0c{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
19/07/21 23:59:01 INFO SparkUI: Stopped Spark web UI at http://amp-jenkins-worker-05.amp:4040
19/07/21 23:59:01 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/07/21 23:59:01 INFO MemoryStore: MemoryStore cleared
19/07/21 23:59:01 INFO BlockManager: BlockManager stopped
19/07/21 23:59:01 INFO BlockManagerMaster: BlockManagerMaster stopped
19/07/21 23:59:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/07/21 23:59:01 INFO SparkContext: Successfully stopped SparkContext
19/07/21 23:59:02 INFO ShutdownHookManager: Shutdown hook called
19/07/21 23:59:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-fa0763a3-171c-424e-bacf-9be4160951da
19/07/21 23:59:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-7325b144-2082-4229-9e64-9fe4cd68a780
19/07/21 23:59:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-7325b144-2082-4229-9e64-9fe4cd68a780/pyspark-24a321ca-18fb-4837-81db-91228ee47206
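The job that just ran comes from dev/pip-sanity-check.py (the DAGScheduler lines above point at the reduce on line 31 of that file, with 10 output partitions). A minimal sketch of that kind of smoke test, assuming only the public PySpark entry points and not reproducing the actual script, looks roughly like:

    # Hedged sketch of a pip-install smoke test launched via spark-submit.
    # Same shape as the logged job: start a session, run a small distributed
    # reduce over 10 partitions, and fail loudly if the result is wrong.
    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("PipSanityCheck").getOrCreate()
        sc = spark.sparkContext

        total = sc.parallelize(range(100), 10).reduce(lambda a, b: a + b)
        if total != 4950:  # sum of 0..99
            raise SystemExit("Unexpected reduce result: %d" % total)

        print("Successfully ran pip sanity check")
        spark.stop()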
Run basic sanity check with import based
19/07/21 23:59:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
[Stage 0:=====>                                                    (1 + 9) / 10]
Successfully ran pip sanity check
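The import-based check logged above is the same idea run from a plain python interpreter rather than through spark-submit, so the pip-installed package has to locate SPARK_HOME and the bundled jars (deps/jars, deps/bin) on its own via its find_spark_home / java_gateway machinery. A hedged sketch of that variant follows; the app name and local master are assumptions, since the log does not show them.

    # Hedged sketch of the import-based check: no spark-submit involved,
    # the JVM gateway is started by the pip-installed pyspark itself.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("local[*]")          # assumed; master not shown in the log
             .appName("PipImportCheck")   # hypothetical name
             .getOrCreate())
    assert spark.range(100).count() == 100
    spark.stop()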
Run the tests for context.py
19/07/21 23:59:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/07/21 23:59:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]

[Stage 4:>                                                          (0 + 0) / 4]

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
19/07/21 23:59:36 WARN PythonRunner: Incomplete task 3.0 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
19/07/21 23:59:36 WARN PythonRunner: Incomplete task 1.0 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
19/07/21 23:59:36 WARN PythonRunner: Incomplete task 0.0 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
19/07/21 23:59:36 WARN PythonRunner: Incomplete task 2.0 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
19/07/21 23:59:36 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39, localhost, executor driver): TaskKilled (Stage cancelled)
19/07/21 23:59:36 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40, localhost, executor driver): TaskKilled (Stage cancelled)
19/07/21 23:59:36 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42, localhost, executor driver): TaskKilled (Stage cancelled)
19/07/21 23:59:36 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41, localhost, executor driver): TaskKilled (Stage cancelled)

DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
Testing pip installation with python 3.5
Using /tmp/tmp.US6uXYj7JT for virtualenv
Collecting package metadata: ...working... done
Solving environment: ...working... done

## Package Plan ##

  environment location: /tmp/tmp.US6uXYj7JT/3.5

  added / updated specs:
    - numpy
    - pandas
    - pip
    - python=3.5
    - setuptools


The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-64::_libgcc_mutex-0.1-main
  blas               pkgs/free/linux-64::blas-1.0-mkl
  ca-certificates    pkgs/main/linux-64::ca-certificates-2019.5.15-0
  certifi            pkgs/main/linux-64::certifi-2018.8.24-py35_1
  intel-openmp       pkgs/main/linux-64::intel-openmp-2019.4-243
  libedit            pkgs/main/linux-64::libedit-3.1.20181209-hc058e9b_0
  libffi             pkgs/main/linux-64::libffi-3.2.1-hd88cf55_4
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
  libgfortran-ng     pkgs/main/linux-64::libgfortran-ng-7.3.0-hdf63c60_0
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
  mkl                pkgs/main/linux-64::mkl-2018.0.3-1
  mkl_fft            pkgs/main/linux-64::mkl_fft-1.0.6-py35h7dd41cf_0
  mkl_random         pkgs/main/linux-64::mkl_random-1.0.1-py35h4414c95_1
  ncurses            pkgs/main/linux-64::ncurses-6.1-he6710b0_1
  numpy              pkgs/main/linux-64::numpy-1.15.2-py35h1d66e8a_0
  numpy-base         pkgs/main/linux-64::numpy-base-1.15.2-py35h81de0dd_0
  openssl            pkgs/main/linux-64::openssl-1.0.2s-h7b6447c_0
  pandas             pkgs/main/linux-64::pandas-0.23.4-py35h04863e7_0
  pip                pkgs/main/linux-64::pip-10.0.1-py35_0
  python             pkgs/main/linux-64::python-3.5.6-hc3d631a_0
  python-dateutil    pkgs/main/linux-64::python-dateutil-2.7.3-py35_0
  pytz               pkgs/main/noarch::pytz-2019.1-py_0
  readline           pkgs/main/linux-64::readline-7.0-h7b6447c_5
  setuptools         pkgs/main/linux-64::setuptools-40.2.0-py35_0
  six                pkgs/main/linux-64::six-1.11.0-py35_1
  sqlite             pkgs/main/linux-64::sqlite-3.29.0-h7b6447c_0
  tk                 pkgs/main/linux-64::tk-8.6.8-hbc83047_0
  wheel              pkgs/main/linux-64::wheel-0.31.1-py35_0
  xz                 pkgs/main/linux-64::xz-5.2.4-h14c3975_4
  zlib               pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3


Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
#
# To activate this environment, use
#
#     $ conda activate /tmp/tmp.US6uXYj7JT/3.5
#
# To deactivate an active environment, use
#
#     $ conda deactivate

Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing top-level names to pyspark.egg-info/top_level.txt
writing requirements to pyspark.egg-info/requires.txt
writing pyspark.egg-info/PKG-INFO
writing manifest file 'pyspark.egg-info/SOURCES.txt'
Could not import pypandoc - required to package PySpark
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.0.0.dev0
creating pyspark-3.0.0.dev0/deps
creating pyspark-3.0.0.dev0/deps/bin
creating pyspark-3.0.0.dev0/deps/data
creating pyspark-3.0.0.dev0/deps/data/graphx
creating pyspark-3.0.0.dev0/deps/data/mllib
creating pyspark-3.0.0.dev0/deps/data/mllib/als
creating pyspark-3.0.0.dev0/deps/data/mllib/images
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.0.0.dev0/deps/data/streaming
creating pyspark-3.0.0.dev0/deps/examples
creating pyspark-3.0.0.dev0/deps/examples/ml
creating pyspark-3.0.0.dev0/deps/examples/mllib
creating pyspark-3.0.0.dev0/deps/examples/sql
creating pyspark-3.0.0.dev0/deps/examples/sql/streaming
creating pyspark-3.0.0.dev0/deps/examples/streaming
creating pyspark-3.0.0.dev0/deps/jars
creating pyspark-3.0.0.dev0/deps/licenses
creating pyspark-3.0.0.dev0/deps/sbin
creating pyspark-3.0.0.dev0/lib
creating pyspark-3.0.0.dev0/pyspark
creating pyspark-3.0.0.dev0/pyspark.egg-info
creating pyspark-3.0.0.dev0/pyspark/ml
creating pyspark-3.0.0.dev0/pyspark/ml/linalg
creating pyspark-3.0.0.dev0/pyspark/ml/param
creating pyspark-3.0.0.dev0/pyspark/mllib
creating pyspark-3.0.0.dev0/pyspark/mllib/linalg
creating pyspark-3.0.0.dev0/pyspark/mllib/stat
creating pyspark-3.0.0.dev0/pyspark/python
creating pyspark-3.0.0.dev0/pyspark/python/pyspark
creating pyspark-3.0.0.dev0/pyspark/sql
creating pyspark-3.0.0.dev0/pyspark/streaming
copying files to pyspark-3.0.0.dev0...
copying MANIFEST.in -> pyspark-3.0.0.dev0
copying README.md -> pyspark-3.0.0.dev0
copying setup.cfg -> pyspark-3.0.0.dev0
copying setup.py -> pyspark-3.0.0.dev0
copying deps/bin/beeline -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.0.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.0.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.0.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.0.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.0.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.0.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.0.0.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.0.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.0.0.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.0.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.0.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.0.0.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-format-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-memory-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/arrow-vector-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/aws-java-sdk-1.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/azure-storage-2.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/breeze_2.12-0.13.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.9.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/generex-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-aws-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-azure-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.4.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/janino-3.0.13.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-sslengine-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.18.v20190429.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.6.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-client-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.1.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/lz4-java-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/machinist_2.12-0.6.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.30.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okapi-shade-0.4.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-core-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.5-nohive.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/orc-shims-1.5.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/py4j-0.10.8.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/pyrolite-4.30.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.8.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.25.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.23.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT-tests.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-cypher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph-api_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graph_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.0.0-SNAPSHOT.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.13.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xbean-asm7-shaded-4.14.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.0-1.jar -> pyspark-3.0.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis.txt -> pyspark-3.0.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.0.0.dev0/deps/sbin
copying lib/py4j-0.10.8.1-src.zip -> pyspark-3.0.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.0.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resourceinformation.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.0.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.0.0.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.0.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.0.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.0.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.0.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.0.0.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-3.0.0.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.0.0.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.0.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.0.0.dev0/pyspark/streaming
Writing pyspark-3.0.0.dev0/setup.cfg
Creating tar archive
removing 'pyspark-3.0.0.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/SparkPullRequestBuilder/python
Collecting py4j==0.10.8.1 (from pyspark==3.0.0.dev0)
  Downloading https://files.pythonhosted.org/packages/04/de/2d314a921ef4c20b283e1de94e0780273678caac901564df06b948e4ba9b/py4j-0.10.8.1-py2.py3-none-any.whl (196kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py develop for pyspark
Successfully installed py4j-0.10.8.1 pyspark
You are using pip version 10.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
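Note: the pip output above shows the freshly built sdist being installed in editable mode ("Running setup.py develop") together with py4j 0.10.8.1; the "mkl-random 1.0.1 requires cython" complaint appears to come from the build host's pre-existing Anaconda packages and is unrelated to PySpark. A minimal, hypothetical way to confirm the install from inside the same virtualenv:

    import pyspark
    # The editable install above should place the dev build on sys.path;
    # the version string is expected to match the sdist name.
    print(pyspark.__version__)  # expected: 3.0.0.dev0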
Run basic sanity check on pip-installed version with spark-submit
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Successfully ran pip sanity check
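Note: the log4j:ERROR and FileNotFoundException above (repeated for every spark-submit invocation below) only mean that a FileAppender configured to write target/unit-tests.log could not open the file because no target/ directory exists under the current working directory; log4j 1.x does not create missing parent directories, and the sanity checks succeed regardless. A hedged workaround, run from the same directory before launching spark-submit:

    import errno, os
    # Pre-creating ./target lets the log4j FileAppender open target/unit-tests.log
    # instead of failing at JVM startup (works on both Python 2 and 3).
    try:
        os.makedirs("target")
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise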
Run basic sanity check with import-based usage
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                         (0 + 0) / 10]
[Stage 0:>                                                        (0 + 10) / 10]
                                                                                
Successfully ran pip sanity check
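Note: both sanity checks boil down to importing the pip-installed package and running a trivial job; the 10-task Stage 0 above is that job's only stage. A minimal sketch of an equivalent check, assuming the virtualenv installed above is active (the actual check script in the Spark repo may differ):

    from pyspark.sql import SparkSession

    # Build a local session from the pip-installed package and run a tiny job,
    # roughly mirroring the 10-task Stage 0 shown in the log above.
    spark = (SparkSession.builder
             .master("local[4]")
             .appName("pip-sanity-check")
             .getOrCreate())
    spark.sparkContext.setLogLevel("WARN")  # the knob the log message above mentions
    rdd = spark.sparkContext.parallelize(range(100), 10)
    assert rdd.sum() == sum(range(100))
    print("Successfully ran pip sanity check")
    spark.stop()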
Run the tests for context.py
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:123)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:74)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:82)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]
[Stage 0:==============>                                            (1 + 3) / 4]
                                                                                

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
                                                                                
DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
Cleaning up temporary directory - /tmp/tmp.US6uXYj7JT
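Note: the Stage 10/11 progress bars above come from the "tests for context.py" step, i.e. the doctests embedded in pyspark/context.py. PySpark modules conventionally run their doctests when executed as __main__, so a hypothetical way to exercise the same tests by hand, assuming SPARK_HOME points at this build and the virtualenv is active:

    import runpy
    # Running the module as __main__ triggers its doctest runner;
    # this assumes pyspark.context keeps that convention.
    runpy.run_module("pyspark.context", run_name="__main__")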

========================================================================
Running SparkR tests
========================================================================
During startup - Warning message:
In .First() :
  Support for R prior to version 3.4 is deprecated since Spark 3.0.0
Loading required package: methods

Attaching package: ‘SparkR’

The following objects are masked from ‘package:testthat’:

    describe, not

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
    sample, subset, summary, transform, union

Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder
basic tests for CRAN: .............

DONE ===========================================================================
binary functions: Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder
...........
functions on binary files: Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder
....
broadcast variables: Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder
..
functions in client.R: .....
test functions in sparkR.R: ..........................................
include R packages: Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder

JVM API: Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder
..
MLlib classification algorithms, except for tree-based algorithms: Spark package found in SPARK_HOME: /home/jenkins/workspace/SparkPullRequestBuilder
.........Attempting to post to Github...
 > Post successful.
Process leaked file descriptors. See http://wiki.jenkins-ci.org/display/JENKINS/Spawning+processes+from+build for more information
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/107978/
Test FAILed.
Finished: FAILURE