
Commit 2ae3301

HyukjinKwon authored and Jackey Lee committed
[HOTFIX] Fix PySpark pip packaging tests by non-ascii compatible character
## What changes were proposed in this pull request?

PIP installation requires packaging the bin scripts together:
https://github.com/apache/spark/blob/master/python/setup.py#L71

The recent fix at apache@ec96d34 introduced a non-ASCII character (a non-breaking space, I guess). This is usually not a problem, but it looks like Jenkins's default encoding is `ascii`, and while copying the script there appears to be an implicit conversion between bytes and strings that uses the default encoding:
https://github.com/pypa/setuptools/blob/v40.4.3/setuptools/command/develop.py#L185-L189

## How was this patch tested?

Jenkins

Closes apache#22782 from HyukjinKwon/pip-failure-fix.

Authored-by: hyukjinkwon <gurwls223@apache.org>
Signed-off-by: hyukjinkwon <gurwls223@apache.org>
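The failure mode described above can be sketched in a few lines. This is an illustrative example, not code from the commit: it shows how an implicit bytes-to-string decode under an `ascii` default encoding blows up the moment the file contains a single non-ASCII byte such as a non-breaking space (U+00A0).

```python
# Hypothetical script content containing a non-breaking space (U+00A0),
# analogous to the character that slipped into docker-image-tool.sh.
content = "# Verify that Spark has\u00a0been built\n"
data = content.encode("utf-8")  # on disk, the file is just bytes

# UTF-8 handles it fine:
assert data.decode("utf-8") == content

# But an environment whose default encoding is 'ascii' (as on the
# Jenkins workers) fails on the implicit decode during the copy:
try:
    data.decode("ascii")
except UnicodeDecodeError as e:
    print("decode failed:", e.reason)
```

Replacing the non-breaking space with a plain ASCII space, as this commit does, makes the script decodable under any default encoding.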
1 parent 2df8e7a commit 2ae3301

1 file changed

Lines changed: 1 addition & 1 deletion


bin/docker-image-tool.sh

```diff
@@ -79,7 +79,7 @@ function build {
   fi

   # Verify that Spark has actually been built/is a runnable distribution
-  # i.e. the Spark JARs that the Docker files will place into the image are present
+  # i.e. the Spark JARs that the Docker files will place into the image are present
   local TOTAL_JARS=$(ls $JARS/spark-* | wc -l)
   TOTAL_JARS=$(( $TOTAL_JARS ))
   if [ "${TOTAL_JARS}" -eq 0 ]; then
```

The removed and added lines look identical because the change is invisible: the non-breaking space in the old comment line is replaced with a regular ASCII space.
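Since the offending character is invisible in the diff, a small checker is handy for locating it. The helper below is an assumption for illustration (it is not part of the Spark repo); it reports the line, column, and code point of every non-ASCII character in a piece of text.

```python
def find_non_ascii(text):
    """Return (line_no, col_no, char) for every non-ASCII character in text.

    Useful for pinpointing invisible characters such as the
    non-breaking space (U+00A0) that this commit removes.
    """
    hits = []
    for line_no, line in enumerate(text.splitlines(), start=1):
        for col_no, ch in enumerate(line, start=1):
            if ord(ch) > 127:
                hits.append((line_no, col_no, ch))
    return hits

# Example: the second line hides a non-breaking space after "bad".
print(find_non_ascii("plain line\nbad\u00a0space\n"))  # → [(2, 4, '\xa0')]
```

Running it over `bin/docker-image-tool.sh` before this fix would have flagged the single U+00A0 in the comment on line 82.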
