@@ -376,8 +376,8 @@ class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLContext {
     val e = intercept[org.apache.spark.SparkException] {
       spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", new Properties()).collect()
     }
-    assert(e.getMessage.contains(
-      "requirement failed: Decimal precision 39 exceeds max precision 38"))
+    assert(e.getCause().isInstanceOf[ArithmeticException])
+    assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))
Contributor @mgaido91 commented:

As a very minor nit, I'd rather check the exception type, which I think is more important than the exact message. Now we should be consistent across the whole codebase and always throw an ArithmeticException, whereas previously we sometimes threw a RuntimeException or other exceptions for the same case.

Member Author @dongjoon-hyun replied (Jul 15, 2019):

Thank you for the review, @mgaido91.

Sure, of course, we can additionally check the underlying exception type via e.getCause on the SparkException. I'll add that.

BTW, message checking is a more fine-grained verification. As you know, ArithmeticException and ParseException are not specific: an ArithmeticException can also be caused, for example, by division by zero. We should always check the error message as well.
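
Taken together, the two suggestions yield a check on both the cause type and the message, as in the diff above. A minimal sketch of that pattern (using ScalaTest's intercept; jdbcUrl and the table name come from the suite's Docker-backed setup):

```scala
import java.util.Properties
import org.apache.spark.SparkException

// Reading without a custom schema should fail: the Oracle column holds
// values whose decimal precision (39) exceeds Spark's maximum (38).
val e = intercept[SparkException] {
  spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", new Properties()).collect()
}
// Check the underlying exception type, per the review suggestion...
assert(e.getCause().isInstanceOf[ArithmeticException])
// ...and also the message, since ArithmeticException alone is not
// specific (division by zero, for instance, throws it too).
assert(e.getMessage.contains("Decimal precision 39 exceeds max precision 38"))
```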


     // custom schema can read data
     val props = new Properties()
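
The truncated context above ("custom schema can read data") indicates the test goes on to show that a user-supplied schema makes the same read succeed. A hedged sketch of that continuation, assuming Spark's customSchema JDBC option; the column list is illustrative, not the suite's exact code:

```scala
// Supplying a custom schema caps the decimal precision at Spark's
// maximum of 38, so the read no longer overflows. The column name and
// type below are assumed for illustration only.
val props = new Properties()
props.put("customSchema", "ID DECIMAL(38, 0)")
val df = spark.read.jdbc(jdbcUrl, "tableWithCustomSchema", props)
df.collect()  // succeeds with the user-specified schema
```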