
Commit 0bac3e4

ymwdalex authored and srowen committed
[SPARK-19797][DOC] ML pipeline document correction
## What changes were proposed in this pull request?

The description of pipelines in this paragraph is incorrect: https://spark.apache.org/docs/latest/ml-pipeline.html#how-it-works

> If the Pipeline had more **stages**, it would call the LogisticRegressionModel's transform() method on the DataFrame before passing the DataFrame to the next stage.

Reason: a Transformer can also be a stage, but only a downstream Estimator causes the fitted model's transform() to be invoked so the transformed data can be passed to the next stage. The current wording misleads ML pipeline users.

## How was this patch tested?

This is a tiny modification of **docs/ml-pipeline.md**. I built the docs with jekyll and checked the compiled document.

Author: Zhe Sun <ymwdalex@gmail.com>

Closes #17137 from ymwdalex/SPARK-19797-ML-pipeline-document-correction.
1 parent fa50143 commit 0bac3e4

1 file changed: 1 addition & 1 deletion

docs/ml-pipeline.md
@@ -132,7 +132,7 @@ The `Pipeline.fit()` method is called on the original `DataFrame`, which has raw
 The `Tokenizer.transform()` method splits the raw text documents into words, adding a new column with words to the `DataFrame`.
 The `HashingTF.transform()` method converts the words column into feature vectors, adding a new column with those vectors to the `DataFrame`.
 Now, since `LogisticRegression` is an `Estimator`, the `Pipeline` first calls `LogisticRegression.fit()` to produce a `LogisticRegressionModel`.
-If the `Pipeline` had more stages, it would call the `LogisticRegressionModel`'s `transform()`
+If the `Pipeline` had more `Estimator`s, it would call the `LogisticRegressionModel`'s `transform()`
 method on the `DataFrame` before passing the `DataFrame` to the next stage.

 A `Pipeline` is an `Estimator`.
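The corrected semantics are easier to see in code. Below is a minimal, hypothetical sketch in plain Python (not Spark's actual implementation; names like `pipeline_fit` are invented for illustration) of how `Pipeline.fit()` walks its stages: a `Transformer` stage just transforms the data, an `Estimator` stage is fit to produce a model, and that model's `transform()` is invoked only when a later stage still needs the transformed data.

```python
# Hypothetical sketch of Pipeline.fit() semantics, NOT Spark's real API.

class Transformer:
    def transform(self, df):
        raise NotImplementedError


class Estimator:
    def fit(self, df):
        # Returns a Transformer (the fitted model).
        raise NotImplementedError


def pipeline_fit(stages, df):
    """Fit each stage in order; return the list of fitted stages."""
    fitted = []
    for i, stage in enumerate(stages):
        if isinstance(stage, Estimator):
            model = stage.fit(df)
            fitted.append(model)
            # Call the model's transform() only if more stages follow;
            # for the last stage there is no downstream consumer.
            if i + 1 < len(stages):
                df = model.transform(df)
        else:
            fitted.append(stage)
            df = stage.transform(df)
    return fitted
```

In this sketch, a trailing `LogisticRegression`-like `Estimator` is fit but its model's `transform()` is never called during `pipeline_fit`, which is exactly the distinction the doc fix makes.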
