Add test coverage #1896
Conversation
Here is a demo after code coverage is activated: zhujinxuan#4. Here is what we need to do before merging this PR:
Codecov Report
```diff
@@            Coverage Diff            @@
##             master    #1896   +/-   ##
==========================================
  Coverage          ?   65.94%
==========================================
  Files             ?       79
  Lines             ?     5436
  Branches          ?     1354
==========================================
  Hits              ?     3585
  Misses            ?     1388
  Partials          ?      463
```
Continue to review full report at Codecov.
Lots to improve coverage-wise :)

Rerun the Travis CI by reopening.

Fix file conflicts~

Hi @ianstormtaylor, the conflict in this PR is resolved after #2021. I made a small dev-only change to KeyUtils and the dev fixtures because …
Hey @zhujinxuan, thank you for all your help here researching how to get code coverage reports. I'm sorry for my poor communication on this pull request. I really like the goal of having code coverage, but I'm just hesitant about switching from Mocha to Jest for lots of small reasons that add up...

Diffs still aren't as good
There are now extra `Object` and `Array` prefixes all throughout the diffs when looking at the JSON output of Slate. It's a small thing, but it's just a lot of noise. (Screenshots: https://user-images.githubusercontent.com/311752/43664631-67d2d006-9722-11e8-8a27-e1bf68ce05f3.png, https://user-images.githubusercontent.com/311752/43663682-48904398-971f-11e8-9bdd-1f31918444ea.png)

Filtering tests isn't as good
Whereas Mocha has the `--fgrep` option, Jest has `-t`, but every suite that it doesn't find a match for fails, and then it exits with a 1 error code. This isn't that nice because it's the main way to debug individual tests. (Screenshot: https://user-images.githubusercontent.com/311752/43663728-6d69e0b6-971f-11e8-9307-68a6282b2aa8.png)

Assertions aren't as good
Since Jest doesn't support the default `assert` module properly, we have to hack around it with the `jest-t-assert` module, but it doesn't have the exact same API as `assert`. Again it's a small thing, but just extra confusion. (Screenshot: https://user-images.githubusercontent.com/311752/43663836-b5f3d09e-971f-11e8-8ae5-0463bfdb829f.png)

Configuration isn't as good
Jest requires more configuration that's going to be more to maintain over time, to figure out why `lerna-aliases` isn't working, or whatever it ends up being. Mocha by contrast was very simple: just point it at an `index.js` file. (Screenshot: https://user-images.githubusercontent.com/311752/43664046-51b560f6-9720-11e8-8b60-db3e5548bd28.png)

Bailing early doesn't work
There's a Jest CLI flag called `--bail`, but it doesn't actually bail out of the tests early like Mocha does... it appears to be broken?

Others feel similarly
I was thinking about this yesterday when I saw Tim Oxley's tweet (https://twitter.com/secoif/status/1025303928705961985) about his troubles with Jest, which are some of the same "buildup of small issues" type worries I have.

All that to say, that's why I haven't merged this sooner. I'm just slightly worried about it. That's not to say I won't merge it... I'd just like to be more sure.

Why is it that this is currently not possible with Mocha? Why do we need to switch to Jest? From what I can tell, Coveralls (or others) work with Mocha too?
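For reference, a coverage report does not strictly require Jest. The snippet below is a minimal sketch of wiring a Mocha suite to Codecov through nyc (Istanbul's CLI); the script names, test entry path, Babel register hook, and version ranges are assumptions for illustration, not something this PR changes.

```json
{
  "scripts": {
    "test": "mocha --require babel-core/register ./packages/test/index.js",
    "coverage": "nyc --reporter=lcov --reporter=text mocha --require babel-core/register ./packages/test/index.js",
    "coverage:upload": "codecov"
  },
  "devDependencies": {
    "codecov": "^3.0.0",
    "nyc": "^12.0.0"
  }
}
```

With this layout, nyc writes `coverage/lcov.info`, which the `codecov` uploader picks up by default; whether it plays nicely with the per-package setup in this repo is exactly what would need to be verified.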
I will give Mocha a try this weekend; I had forgotten about the Jest question. When I was starting this PR, I knew little about the coverage workflow, and it was easier to do with Jest. I will try to switch back to Mocha this weekend.
@zhujinxuan thank you, I really appreciate it. I totally understand that feeling of forgetting why in the first place. Thank you for all of your help!
The Codecov tests will be moved to PR #2037. Closing this PR.
Is this adding or improving a feature or fixing a bug?
feature
- We no longer run `yarn build` for tests; we only build when linting code.
- `yarn lint` is split into `yarn lint:code` and `yarn lint:document`, so it runs faster, and it is less confusing when contributors see an error from `lint:document` rather than just `lint`. (A sketch of the resulting scripts is below.)
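A rough sketch of what the split scripts could look like; the exact commands behind `lint:code` and `lint:document` are placeholders here, and only the split itself reflects this PR.

```json
{
  "scripts": {
    "lint": "yarn lint:code && yarn lint:document",
    "lint:code": "eslint packages/*/src packages/*/test",
    "lint:document": "prettier --list-different \"docs/**/*.md\""
  }
}
```

Running the two halves separately also makes it clearer whether a failure comes from code style or from the documentation.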
What's the new behavior?
Sometimes we make careless mistakes simply by forgetting to test, or we decide to add tests later and then forget. (For example, PR #1864 is missing a test.)
Coverage analysis can help us eliminate these problems and suggest which tests might be missing. New PR contributors can also look at the code coverage diff for their changes.
How does this change work?
Add Codecov as the code coverage service. It reports which files and lines are tested and which are not yet tested; when a PR is submitted, we can check which newly added lines are not yet covered. For an example, see the demo in zhujinxuan#4 above. (A minimal sketch of the coverage configuration follows.)
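For the Jest route this PR takes, the coverage wiring can live in package.json. The snippet below is a minimal sketch with assumed globs and script names, not the exact configuration in this PR.

```json
{
  "scripts": {
    "test": "jest",
    "coverage": "jest --coverage && codecov"
  },
  "jest": {
    "testMatch": ["<rootDir>/packages/*/test/**/*.js"],
    "collectCoverageFrom": ["packages/*/src/**/*.js"],
    "coverageReporters": ["json", "lcov", "text"],
    "coverageDirectory": "coverage"
  }
}
```

The lcov output under `coverage/` is what Codecov consumes when the CI job runs the uploader.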
Have you checked that...?
- The tests pass with `yarn test`.
- The linter passes with `yarn lint`. (Fix errors with `yarn prettier`.)
- The examples still work. (Run them with `yarn watch`.)
Does this fix any issues or need any specific reviewers?
Fixes: #1890
Reviewers: @