feat(jest-core): Add perfStats to surface test setup overhead #14622
SimenB merged 6 commits into jestjs:main
Conversation
Force-pushed from 05f5d7e to 0291093
No problem! Thanks for taking a look! I originally considered the same about the performance marks, as I am already leaning on those to observe the runtime of Jest, but I was concerned about the performance mark entries getting noisy, since the `perfStats` metadata I introduced is per test file. So I thought of localizing that metadata to a results object associated with the test file. With that said, I do not really love the attribute names, but they do make it clear what the metadata is.
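(For illustration, a minimal sketch of the trade-off described above, using hypothetical names rather than the PR's actual identifiers:)

```ts
import {performance} from 'perf_hooks';

// Option A: global performance marks. One pair of marks per test file
// quickly makes the global entry timeline noisy.
performance.mark('setupAfterEnv-start /path/to/a.test.ts');
performance.mark('setupAfterEnv-end /path/to/a.test.ts');

// Option B: localize the timings to a result object associated with the
// test file, the approach taken here. The field names are hypothetical.
const fileResult = {
  testFilePath: '/path/to/a.test.ts',
  perfStats: {
    setupAfterEnvStart: Date.now(),
    setupAfterEnvEnd: Date.now(),
  },
};
```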
I think it's fine either way. I can see an argument to justify the difference.
packages/jest-types/src/Config.ts (Outdated)

```ts
  workerIdleMemoryLimit?: number;
};

export type ProjectConfigPerfStats = {
```
I don't think this belongs in Config. Seems more like a TestResult thing
Ah yeah, good call!
Considering this just captures setupAfterEnv perfStats, what are your thoughts on `SetupAfterEnvPerfStats`? I think the reason I avoided using `TestResult` here is that this is only a subset of the `TestResult` perfStats, given where I can capture this metadata during the lifecycle of test execution.
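(As a rough sketch of the subset type under discussion; the field names are assumptions drawn from this thread, not necessarily the merged code:)

```ts
// Hypothetical: only the timings that can be captured around
// setupFilesAfterEnv execution, a subset of the file's full perfStats.
export type SetupAfterEnvPerfStats = {
  setupAfterEnvStart: number; // epoch ms when setupFilesAfterEnv began
  setupAfterEnvEnd: number; // epoch ms when setupFilesAfterEnv finished
};
```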
SimenB left a comment:
Yeah, I buy that argument. Thanks!
Should we document these props anywhere? I guess we don't currently, and this PR probably isn't the place to add it.
Happy to add documentation or tackle that in a subsequent pull request. Do you have thoughts on where it might be most appropriate? I'm seeing these areas as possible contenders for more context:
Ah, the first link has that, so I guess we should add the new props there. Maybe that's enough?
That sounds good to me! I'll get the changes up shortly. Edit: updated!
Force-pushed from b6fe351 to 6d59478
Force-pushed from b8ed500 to 2a70b2b
👍
Summary
Introduces additional performance metadata to surface the overhead of test setup. Specifically, this aims to highlight the expense of `testEnvironment`, `setupFiles`, and `setupFilesAfterEnv`. The metadata currently available does not surface where test files spend their time outside of test case `duration`, so these changes aim to highlight that context.

The change primarily extends the `perfStats` object of the `TestResult` type, as these lifecycle events occur at the test file level. Additionally, the change introduces `startedAt` on the `AssertionResult`. `duration` is already exposed, but `startedAt` was not, even though that attribute was already available on `TestEntry`. Exposing `startedAt` provides additional context on when the test ran relative to the `start` and `end` time of the file itself.

Test plan
Run a simple test with a custom reporter for surfacing the new metadata:
Simple test
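For instance, a trivial test like the following works as a hypothetical stand-in, since any test file exercises the setup phases being measured:

```ts
// sum.test.ts — the file-level setup around this test is what the new
// perfStats are meant to surface.
test('adds numbers', () => {
  expect(1 + 2).toBe(3);
});
```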
Custom reporter
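A sketch of what such a reporter could look like; `onTestResult`, `testResult.testFilePath`, and `testResult.perfStats` are existing Jest reporter surfaces, while how the new fields are presented is an assumption:

```ts
import type {Test, TestResult} from '@jest/test-result';

// Minimal custom reporter: logs each file's perfStats as tests complete.
class PerfStatsReporter {
  onTestResult(_test: Test, testResult: TestResult): void {
    const {start, end} = testResult.perfStats;
    console.log(testResult.testFilePath, {
      totalMs: end - start,
      perfStats: testResult.perfStats, // includes the newly added fields
    });
  }
}

module.exports = PerfStatsReporter;
```

It would be registered through the `reporters` option in the Jest config.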
New perfStats
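Roughly the shape to expect in the output; the setup-related field names and all values below are illustrative assumptions based on the surfaces named in the summary (`testEnvironment`, `setupFiles`, `setupFilesAfterEnv`), not copied from the PR:

```ts
// Hypothetical per-file perfStats after this change (all times epoch ms).
const perfStats = {
  start: 1700000000000,
  end: 1700000001500,
  runtime: 1500,
  slow: false,
  // Assumed new fields surfacing setup overhead:
  loadTestEnvironmentStart: 1700000000000,
  loadTestEnvironmentEnd: 1700000000200,
  setupFilesStart: 1700000000200,
  setupFilesEnd: 1700000000350,
  setupAfterEnvStart: 1700000000350,
  setupAfterEnvEnd: 1700000000600,
};
```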
New testResult
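And per-test entries gain `startedAt` alongside the existing `duration` (again an illustrative sketch, not the PR's actual output):

```ts
// Hypothetical AssertionResult excerpt.
const assertionResult = {
  title: 'adds numbers',
  status: 'passed',
  duration: 12, // ms; already exposed before this change
  startedAt: 1700000000650, // epoch ms; newly exposed from TestEntry
};
```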