[SPARK-51691][CORE][TESTS] SerializationDebugger should swallow exceptions when trying to find the reason of a serialization problem #50489
base: master
Conversation
```scala
} catch {
  case _: SparkRuntimeException => "cannot print object"
}
val elem = s"object (class ${s.getClass.getName}, $str)"
```
Is it possible to add a new test case? Or which existing test case covers this scenario?
Added the test `SPARK-51691 improveException swallow underlying exception` to cover this scenario.
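A sketch of what that test could look like, assuming the suite's existing `NotSerializable` fixture and the `SerializableClassWithStringException` class added below; the exact assertions in the PR may differ:

```scala
import java.io.NotSerializableException

test("SPARK-51691 improveException swallow underlying exception") {
  // The outer object is serializable but its toString throws; the nested
  // field is what actually fails to serialize.
  val obj = new SerializableClassWithStringException(new NotSerializable)
  val e = SerializationDebugger.improveException(
    obj, new NotSerializableException("someClass"))
  // The improved message should still report the reference chain instead
  // of surfacing the SparkRuntimeException thrown by toString.
  assert(e.getMessage.contains("someClass"))
  assert(e.getMessage.contains("cannot print object"))
}
```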
Can we just add a branch like `case s: Object with java.io.Serializable if Utils.isTesting =>`?
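For illustration, that alternative would shape the visitor roughly like this (a sketch only; `guardedVisit` is a hypothetical helper, and the real `SerializationDebugger.visit` handles many more cases):

```scala
import org.apache.spark.util.Utils

// Stand-ins for SerializationDebugger's private helpers, for illustration.
def visitSerializable(s: Object, stack: List[String]): List[String] = stack
def guardedVisit(s: Object, stack: List[String]): List[String] = stack

def visit(obj: Any, stack: List[String]): List[String] = obj match {
  // Suggested extra branch: tolerate a throwing toString only while testing,
  // leaving the production path untouched.
  case s: Object with java.io.Serializable if Utils.isTesting =>
    guardedVisit(s, stack)
  case s: Object with java.io.Serializable =>
    visitSerializable(s, stack)
  case _ => List.empty
}
```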
```
@@ -205,6 +214,12 @@ class SerializableClass1 extends Serializable

class SerializableClass2(val objectField: Object) extends Serializable

class SerializableClassWithStringException(val objectField: Object) extends Serializable {
  override def toString: String = {
```
This simulates the behavior of the `toString` implementation of `TreeNode`, since `SQLConf.get` may throw an exception.
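A plausible completion of that fixture, since the snippet above is truncated; the error class and message passed to `SparkRuntimeException` are assumptions:

```scala
import org.apache.spark.SparkRuntimeException

// Test fixture: a Serializable class whose toString throws, mimicking
// TreeNode.toString failing on the SQLConf.get check during tests.
class SerializableClassWithStringException(val objectField: Object) extends Serializable {
  override def toString: String = {
    throw new SparkRuntimeException(
      errorClass = "INTERNAL_ERROR",
      messageParameters = Map("message" -> "SQLConf.get is unavailable"))
  }
}
```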
spark/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala (lines 228–229 at 554d678):

```scala
} else if (Utils.isTesting) {
  throw QueryExecutionErrors.cannotGetSQLConfInSchedulerEventLoopThreadError()
```
So is this an issue that can only occur during testing?
Yes, it only throws an exception during tests.
@summaryzb Could you update the description of the PR to clarify that the changes in this PR are intended to present a clearer exception stack hierarchy in the tests? (Is my understanding correct?)
Updated the title and description to clarify that this is a unit test issue.
+1, LGTM
Let's wait another 48 hours. If no one has any other comments, I will merge it.
LGTM
What changes were proposed in this pull request?
Catch `SparkRuntimeException` when digging into the serialization exception stack during unit tests.

Why are the changes needed?
Present a clearer serialization exception stack hierarchy during tests: the `toString` implementation of `TreeNode` may throw an exception because of the `SQLConf.get` check. This helps in debugging the real problem.

Does this PR introduce any user-facing change?
Yes, but it only takes effect in unit tests: users will see the direct serialization exception and the reference chain beyond the root cause.
Before this PR, users would get confused because an unrelated exception was shown; after this PR, the actual serialization failure is surfaced.
How was this patch tested?
Pass GitHub Actions
Was this patch authored or co-authored using generative AI tooling?
No