
Conversation

@HyukjinKwon
Member

@HyukjinKwon HyukjinKwon commented May 3, 2019

What changes were proposed in this pull request?

This PR partially reverts #20691

After we changed the Python pickle protocol to the highest one, it seems a correctness bug was introduced. This potentially affects all Python-related code paths.

I suspect a bug related to Pyrolite (maybe the MEMOIZE and/or FRAME opcodes, and/or our RowPickler). I would like to stick to the default protocol for now and investigate the issue separately.

I will investigate separately later to bring the highest protocol back.

Update:
I can almost confirm that this is a bug in Pyrolite.

  1. Python lists are chunked via BatchedSerializer at parallelize (in createDataFrame), for instance, as below (each line is one batch):

    [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
    [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
    ...
    [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]

    Here are the opcodes for each batch (separated by ===...) with protocol 4; a standalone sketch that reproduces such a dump is shown after this list:

    ==============================
        0: \x80 PROTO      4
        2: \x95 FRAME      47
       11: ]    EMPTY_LIST
       12: \x94 MEMOIZE    (as 0)
       13: (    MARK
       14: ]        EMPTY_LIST
       15: \x94     MEMOIZE    (as 1)
       16: (        MARK
       17: K            BININT1    1
       19: K            BININT1    2
       21: K            BININT1    3
       23: K            BININT1    4
       25: e            APPENDS    (MARK at 16)
       26: \x85     TUPLE1
       27: \x94     MEMOIZE    (as 2)
       28: h        BINGET     1
       30: \x85     TUPLE1
       31: \x94     MEMOIZE    (as 3)
       32: h        BINGET     1
       34: \x85     TUPLE1
       35: \x94     MEMOIZE    (as 4)
       36: h        BINGET     1
       38: \x85     TUPLE1
       39: \x94     MEMOIZE    (as 5)
       40: h        BINGET     1
       42: \x85     TUPLE1
       43: \x94     MEMOIZE    (as 6)
       44: h        BINGET     1
       46: \x85     TUPLE1
       47: \x94     MEMOIZE    (as 7)
       48: h        BINGET     1
       50: \x85     TUPLE1
       51: \x94     MEMOIZE    (as 8)
       52: h        BINGET     1
       54: \x85     TUPLE1
       55: \x94     MEMOIZE    (as 9)
       56: e        APPENDS    (MARK at 13)
       57: .    STOP
    highest protocol among opcodes = 4
    ==============================
    ==============================
        0: \x80 PROTO      4
        2: \x95 FRAME      47
       11: ]    EMPTY_LIST
       12: \x94 MEMOIZE    (as 0)
       13: (    MARK
       14: ]        EMPTY_LIST
       15: \x94     MEMOIZE    (as 1)
       16: (        MARK
       17: K            BININT1    1
       19: K            BININT1    2
       21: K            BININT1    3
       23: K            BININT1    4
       25: e            APPENDS    (MARK at 16)
       26: \x85     TUPLE1
       27: \x94     MEMOIZE    (as 2)
       28: h        BINGET     1
       30: \x85     TUPLE1
       31: \x94     MEMOIZE    (as 3)
       32: h        BINGET     1
       34: \x85     TUPLE1
       35: \x94     MEMOIZE    (as 4)
       36: h        BINGET     1
       38: \x85     TUPLE1
       39: \x94     MEMOIZE    (as 5)
       40: h        BINGET     1
       42: \x85     TUPLE1
       43: \x94     MEMOIZE    (as 6)
       44: h        BINGET     1
       46: \x85     TUPLE1
       47: \x94     MEMOIZE    (as 7)
       48: h        BINGET     1
       50: \x85     TUPLE1
       51: \x94     MEMOIZE    (as 8)
       52: h        BINGET     1
       54: \x85     TUPLE1
       55: \x94     MEMOIZE    (as 9)
       56: e        APPENDS    (MARK at 13)
       57: .    STOP
    highest protocol among opcodes = 4
    ==============================
    ...
    ==============================
        0: \x80 PROTO      4
        2: \x95 FRAME      31
       11: ]    EMPTY_LIST
       12: \x94 MEMOIZE    (as 0)
       13: (    MARK
       14: ]        EMPTY_LIST
       15: \x94     MEMOIZE    (as 1)
       16: (        MARK
       17: K            BININT1    1
       19: K            BININT1    2
       21: K            BININT1    3
       23: K            BININT1    4
       25: e            APPENDS    (MARK at 16)
       26: \x85     TUPLE1
       27: \x94     MEMOIZE    (as 2)
       28: h        BINGET     1
       30: \x85     TUPLE1
       31: \x94     MEMOIZE    (as 3)
       32: h        BINGET     1
       34: \x85     TUPLE1
       35: \x94     MEMOIZE    (as 4)
       36: h        BINGET     1
       38: \x85     TUPLE1
       39: \x94     MEMOIZE    (as 5)
       40: e        APPENDS    (MARK at 13)
       41: .    STOP
    highest protocol among opcodes = 4
    ==============================
    
  2. Those batches become binary in an RDD at parallelize - so far the results look fine.

  3. rdd._to_java_object_rdd -> SerDeUtil.pythonToJava -> Unpickler.loads

  4. Here, Pyrolite's Unpickler.loads converts each Python object batch into Java objects.

  5. The Java objects converted by Pyrolite look a bit odd, as below (see the last three lists):

    [1, 2, 3, 4]
    [1, 2, 3, 4]
    ...
    [[[1, 2, 3, 4]], [[1, 2, 3, 4]]]
    [[[1, 2, 3, 4]], [[1, 2, 3, 4]]]
    [[[1, 2, 3, 4]], [[1, 2, 3, 4]]]
    
  6. The last three nested lists are ignored by EvaluatePython.makeFromJava and become null -> None.

  7. So the Jenkins test failed at [SPARK-27612][PYTHON] Use Python's default protocol instead of highest protocol #24519 (comment) as below:

    Traceback (most recent call last):
      File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/sql/tests/test_serde.py", line 134, in test_int_array_serialization
        self.assertEqual(len(list(filter(lambda r: None in r.value, df.collect()))), 0)
    AssertionError: 3 != 0
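
For reference, a minimal standalone sketch that reproduces the kind of opcode dump shown in step 1, using only the standard library. The batch shape here is an assumption modelled on the dumps above (rows as 1-tuples all referencing the same inner list); the exact objects BatchedSerializer emits may differ.

    import pickle
    import pickletools

    # Hypothetical stand-in for one batch: several rows (1-tuples) that all
    # reference the same inner list object, pickled with protocol 4. Object
    # sharing is what triggers the MEMOIZE/BINGET pattern seen above.
    inner = [1, 2, 3, 4]
    batch = [(inner,) for _ in range(8)]

    data = pickle.dumps(batch, protocol=4)
    pickletools.dis(data)  # prints PROTO 4, FRAME, MEMOIZE, BINGET 1, ... STOP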
    

How was this patch tested?

A unit test was added.

./run-tests --python-executables=python3.7 --testname "pyspark.sql.tests.test_serde SerdeTests.test_int_array_serialization"
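
For context, a rough reconstruction of what that test asserts; only the final assertion is taken verbatim from the traceback above, and the data construction is an assumption (the actual code in test_serde.py may differ):

    # Hypothetical sketch of SerdeTests.test_int_array_serialization.
    def test_int_array_serialization(self):
        # Enough identical rows that the data is split into several pickle batches.
        data = [[1, 2, 3, 4]] * 100
        df = self.spark.createDataFrame(data, "array<integer>")
        # With the protocol-4 bug, some rows came back with None inside the
        # array (see steps 5-6 above); with the default protocol this must be 0.
        self.assertEqual(len(list(filter(lambda r: None in r.value, df.collect()))), 0)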

@HyukjinKwon
Member Author

cc @BryanCutler and @viirya

This PR does not identify the root cause yet (I only have a rough hypothesis). I would like to get this in first and then do the investigation separately.

One concern about using the default protocol is that it's not tested in cloudpickle. I opened a PR to add an environment variable to control that and test it (cloudpipe/cloudpickle#265).
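
Roughly the kind of switch proposed there, sketched under assumptions (the actual variable name and plumbing in cloudpipe/cloudpickle#265 may differ):

    import os
    import pickle

    # Hypothetical: let an environment variable override the pickle protocol used
    # in tests, so the default-protocol code path also gets CI coverage.
    pickle_protocol = int(os.environ.get("PICKLE_PROTOCOL", pickle.DEFAULT_PROTOCOL))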

 basestring = unicode = str
 xrange = range
-pickle_protocol = pickle.HIGHEST_PROTOCOL
+pickle_protocol = 3
Member Author


We can use pickle.DEFAULT_PROTOCOL too, but let me stick with the constant since protocol 4 seems to have this bug.
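
For reference, the two module constants in question and their values on Python 3.7 (the interpreter used in the Jenkins runs here); both can change across Python versions, which is one reason to pin the literal 3:

    import pickle

    print(pickle.DEFAULT_PROTOCOL)   # 3 on Python 3.7
    print(pickle.HIGHEST_PROTOCOL)   # 4 on Python 3.7 - the protocol suspected above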

Member

@dongjoon-hyun dongjoon-hyun May 3, 2019


It looks nice and solid. BTW, do you think we can have a pointer to the upstream bug issue against pickle?

Member Author


Do you mean the bug issue related to the root cause somewhere? (I think) it's more of an issue within the Pyrolite library... I am not 100% sure yet. I will update that when I have it. I am looking into this to identify the cause.

@SparkQA

SparkQA commented May 3, 2019

Test build #105096 has finished for PR 24519 at commit 1de0478.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

@viirya viirya left a comment


Agreed that this can be a temporary fix for the correctness issue. We can investigate the root cause more after this.

@viirya
Member

viirya commented May 3, 2019

The test failed? Can't the default protocol fix this?

@HyukjinKwon
Member Author

HyukjinKwon commented May 3, 2019

Oh, #24519 (comment) was intended to fail (1de0478). I wanted to make sure it fails in Jenkins too.

@dongjoon-hyun
Member

The last commit is still under testing~

@viirya
Member

viirya commented May 3, 2019

Oh, I didn't see that. Thanks.

@SparkQA

SparkQA commented May 3, 2019

Test build #105097 has finished for PR 24519 at commit 3ab6eb2.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member Author

HyukjinKwon commented May 3, 2019

I can almost confirm that this is a bug in Pyrolite.

  1. Python lists are chunked via BatchedSerializer at parallelize (in createDataFrame), for instance, as below (each line is one batch):

    [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
    [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
    ...
    [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]

    Here are the opcodes for each batch (separated by ===...) with protocol 4:

    ==============================
        0: \x80 PROTO      4
        2: \x95 FRAME      47
       11: ]    EMPTY_LIST
       12: \x94 MEMOIZE    (as 0)
       13: (    MARK
       14: ]        EMPTY_LIST
       15: \x94     MEMOIZE    (as 1)
       16: (        MARK
       17: K            BININT1    1
       19: K            BININT1    2
       21: K            BININT1    3
       23: K            BININT1    4
       25: e            APPENDS    (MARK at 16)
       26: \x85     TUPLE1
       27: \x94     MEMOIZE    (as 2)
       28: h        BINGET     1
       30: \x85     TUPLE1
       31: \x94     MEMOIZE    (as 3)
       32: h        BINGET     1
       34: \x85     TUPLE1
       35: \x94     MEMOIZE    (as 4)
       36: h        BINGET     1
       38: \x85     TUPLE1
       39: \x94     MEMOIZE    (as 5)
       40: h        BINGET     1
       42: \x85     TUPLE1
       43: \x94     MEMOIZE    (as 6)
       44: h        BINGET     1
       46: \x85     TUPLE1
       47: \x94     MEMOIZE    (as 7)
       48: h        BINGET     1
       50: \x85     TUPLE1
       51: \x94     MEMOIZE    (as 8)
       52: h        BINGET     1
       54: \x85     TUPLE1
       55: \x94     MEMOIZE    (as 9)
       56: e        APPENDS    (MARK at 13)
       57: .    STOP
    highest protocol among opcodes = 4
    ==============================
    ==============================
        0: \x80 PROTO      4
        2: \x95 FRAME      47
       11: ]    EMPTY_LIST
       12: \x94 MEMOIZE    (as 0)
       13: (    MARK
       14: ]        EMPTY_LIST
       15: \x94     MEMOIZE    (as 1)
       16: (        MARK
       17: K            BININT1    1
       19: K            BININT1    2
       21: K            BININT1    3
       23: K            BININT1    4
       25: e            APPENDS    (MARK at 16)
       26: \x85     TUPLE1
       27: \x94     MEMOIZE    (as 2)
       28: h        BINGET     1
       30: \x85     TUPLE1
       31: \x94     MEMOIZE    (as 3)
       32: h        BINGET     1
       34: \x85     TUPLE1
       35: \x94     MEMOIZE    (as 4)
       36: h        BINGET     1
       38: \x85     TUPLE1
       39: \x94     MEMOIZE    (as 5)
       40: h        BINGET     1
       42: \x85     TUPLE1
       43: \x94     MEMOIZE    (as 6)
       44: h        BINGET     1
       46: \x85     TUPLE1
       47: \x94     MEMOIZE    (as 7)
       48: h        BINGET     1
       50: \x85     TUPLE1
       51: \x94     MEMOIZE    (as 8)
       52: h        BINGET     1
       54: \x85     TUPLE1
       55: \x94     MEMOIZE    (as 9)
       56: e        APPENDS    (MARK at 13)
       57: .    STOP
    highest protocol among opcodes = 4
    ==============================
    ...
    ==============================
        0: \x80 PROTO      4
        2: \x95 FRAME      31
       11: ]    EMPTY_LIST
       12: \x94 MEMOIZE    (as 0)
       13: (    MARK
       14: ]        EMPTY_LIST
       15: \x94     MEMOIZE    (as 1)
       16: (        MARK
       17: K            BININT1    1
       19: K            BININT1    2
       21: K            BININT1    3
       23: K            BININT1    4
       25: e            APPENDS    (MARK at 16)
       26: \x85     TUPLE1
       27: \x94     MEMOIZE    (as 2)
       28: h        BINGET     1
       30: \x85     TUPLE1
       31: \x94     MEMOIZE    (as 3)
       32: h        BINGET     1
       34: \x85     TUPLE1
       35: \x94     MEMOIZE    (as 4)
       36: h        BINGET     1
       38: \x85     TUPLE1
       39: \x94     MEMOIZE    (as 5)
       40: e        APPENDS    (MARK at 13)
       41: .    STOP
    highest protocol among opcodes = 4
    ==============================
    
  2. Those batches become binary in an RDD at parallelize - so far the results look fine.

  3. rdd._to_java_object_rdd -> SerDeUtil.pythonToJava -> Unpickler.loads

  4. Here, Pyrolite's Unpickler.loads converts each Python object batch into Java objects.

  5. The Java objects converted by Pyrolite look a bit odd, as below (see the last three lists):

    [1, 2, 3, 4]
    [1, 2, 3, 4]
    ...
    [[[1, 2, 3, 4]], [[1, 2, 3, 4]]]
    [[[1, 2, 3, 4]], [[1, 2, 3, 4]]]
    [[[1, 2, 3, 4]], [[1, 2, 3, 4]]]
    
  6. The last three nested lists are ignored by EvaluatePython.makeFromJava and become null -> None.

  7. So the Jenkins test failed at [SPARK-27612][PYTHON] Use Python's default protocol instead of highest protocol #24519 (comment) as below:

    Traceback (most recent call last):
      File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/sql/tests/test_serde.py", line 134, in test_int_array_serialization
        self.assertEqual(len(list(filter(lambda r: None in r.value, df.collect()))), 0)
    AssertionError: 3 != 0
    

@HyukjinKwon
Member Author

Not sure if there's a quick fix, because the only notable opcode that appears in both protocol 4 and the current reproducer is MEMOIZE. We would need to peek at the memo in Unpickler, which is private in Pyrolite 4.13. It's protected in 4.17, so maybe we can upgrade Pyrolite later and debug further.

@HyukjinKwon
Member Author

Merged to master.

Thanks for review, @dongjoon-hyun and @viirya

HyukjinKwon pushed a commit that referenced this pull request May 4, 2019
…ling

## What changes were proposed in this pull request?

In SPARK-27612, a correctness issue was reported. When protocol 4 is used to pickle Python objects, we found that the unpickled objects were wrong. A temporary fix was proposed by not using the highest protocol.

It was found that Opcodes.MEMOIZE appeared in the opcodes under protocol 4 and was suspected to cause this issue.

A deeper dive found that Opcodes.MEMOIZE stores objects into an internal map of the Unpickler object. We use a single Unpickler object to unpickle serialized Python bytes. If the map is not cleared, stored objects interfere with the next round of unpickling.

We had two options:

1. Continue to reuse the Unpickler, but call its close after each unpickling.
2. Do not reuse the Unpickler; create a new Unpickler object for each unpickling.

This patch takes option 1.

## How was this patch tested?

Passing the test added in SPARK-27612 (#24519).

Closes #24521 from viirya/SPARK-27629.

Authored-by: Liang-Chi Hsieh <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
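
A minimal Python model of the failure mode described in the commit message above. This is not Pyrolite code, just an illustration under the assumption that MEMOIZE stores at the next free memo index while BINGET looks the index up absolutely, so a memo shared across streams resolves lookups against stale entries:

    # Toy model of a decoder that keeps ONE memo dict across pickle streams,
    # as the reused Unpickler effectively did before this fix.
    memo = {}

    def memoize(obj):   # models Opcodes.MEMOIZE: store at the next free index
        memo[len(memo)] = obj

    def binget(index):  # models Opcodes.BINGET: absolute lookup into the memo
        return memo[index]

    # Stream 1 memoizes its shared inner list; within stream 1, index 0 is correct.
    memoize([1, 2, 3, 4])
    assert binget(0) == [1, 2, 3, 4]

    # Stream 2 was encoded assuming a fresh memo, so it also refers to index 0,
    # but the stale entry from stream 1 is still there and stream 2's own objects
    # land at higher indices - lookups now mix objects from different streams.
    # Clearing/closing the Unpickler between loads resets the memo and avoids this.
    memoize(["stream-2 object"])
    assert binget(0) == [1, 2, 3, 4]         # still stream 1's object
    assert binget(1) == ["stream-2 object"]  # stream 2's object, shifted by one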
@dongjoon-hyun
Member

dongjoon-hyun commented Jan 25, 2020

Hi, @HyukjinKwon.
This issue is marked as a correctness issue. Do we need to backport this?
Never mind. I confirm that this doesn't happen in 2.4.4.

@HyukjinKwon HyukjinKwon deleted the SPARK-27612 branch March 3, 2020 01:18
rshkv pushed a commit to palantir/spark that referenced this pull request May 23, 2020
…ling

In SPARK-27612, a correctness issue was reported. When protocol 4 is used to pickle Python objects, we found that the unpickled objects were wrong. A temporary fix was proposed by not using the highest protocol.

It was found that Opcodes.MEMOIZE appeared in the opcodes under protocol 4 and was suspected to cause this issue.

A deeper dive found that Opcodes.MEMOIZE stores objects into an internal map of the Unpickler object. We use a single Unpickler object to unpickle serialized Python bytes. If the map is not cleared, stored objects interfere with the next round of unpickling.

We had two options:

1. Continue to reuse the Unpickler, but call its close after each unpickling.
2. Do not reuse the Unpickler; create a new Unpickler object for each unpickling.

This patch takes option 1.

Passing the test added in SPARK-27612 (apache#24519).

Closes apache#24521 from viirya/SPARK-27629.

Authored-by: Liang-Chi Hsieh <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
rshkv pushed a commit to palantir/spark that referenced this pull request Jun 5, 2020
…ling

In SPARK-27612, a correctness issue was reported. When protocol 4 is used to pickle Python objects, we found that the unpickled objects were wrong. A temporary fix was proposed by not using the highest protocol.

It was found that Opcodes.MEMOIZE appeared in the opcodes under protocol 4 and was suspected to cause this issue.

A deeper dive found that Opcodes.MEMOIZE stores objects into an internal map of the Unpickler object. We use a single Unpickler object to unpickle serialized Python bytes. If the map is not cleared, stored objects interfere with the next round of unpickling.

We had two options:

1. Continue to reuse the Unpickler, but call its close after each unpickling.
2. Do not reuse the Unpickler; create a new Unpickler object for each unpickling.

This patch takes option 1.

Passing the test added in SPARK-27612 (apache#24519).

Closes apache#24521 from viirya/SPARK-27629.

Authored-by: Liang-Chi Hsieh <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>