
Support failing a test if no assertions are run #2209

Closed
orta opened this issue Dec 2, 2016 · 35 comments

Comments

@orta
Member

orta commented Dec 2, 2016

Do you want to request a feature or report a bug?

Feature

What is the current behavior?

A bad translation from mocha -> Jest had tests that looked like this:

describe('File Parsing for expects', () => {
  it('For a danger test file', async () => {
    const parser = new TestFileParser();
    await parser.run(`${fixtures}/dangerjs/travis-ci.example`);
    expect(parser.expects.length, 2);
  });
});

Which passes, but shouldn't.

This can also happen with promises and any async code being tested.

If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal repository on GitHub that we can npm install and npm test.

What is the expected behavior?

Offer a config variable that fails your test if no assertions are used during the test run. IMO default to on too.

Run Jest again with --debug and provide the full configuration it prints. Please mention your node and npm version and operating system.

N/A - but this was running on master, Node 7, npm 3.

@cpojer
Member

cpojer commented Dec 2, 2016

Jest 17.1 (today?) will support expect.assertions(number). Does that work for you?
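For reference, a minimal sketch of how expect.assertions(number) is typically used; fetchUser here is a hypothetical async function:

it('fetches the user', () => {
  // Fail the test unless exactly one assertion is called during its run.
  expect.assertions(1);
  return fetchUser().then(user => {
    expect(user.name).toBe('orta');
  });
});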

@orta
Member Author

orta commented Dec 2, 2016

That is awesome - I'd like every it to check that the count is > 0 at the end of the test.

// Should fail because no assertions
it("does nothing", () => {
})

// could encourage this for stubs, which can pass
it("does nothing")

// should fail because no assertions happen during the test
const fs = require("fs")
it("does something with the file", () => {
  fs.readFile("file.js", (err, data) => {
    expect(data).toBeTruthy()
  })
})

@cpojer
Member

cpojer commented Dec 2, 2016

Hmm, this makes sense, but we do have a legit use case here of making sure something doesn't fail, so I don't wanna make it more restrictive.

it('does not throw', () => {
  RenderMyApp();
});

@jwbay
Contributor

jwbay commented Dec 2, 2016

There's expect(() => RenderMyApp()).not.toThrow(), right? It's verbose, but I'm not sure how common that kind of thing is. Ensuring each test makes an assertion, even if that behavior is behind a flag, would be really valuable for async-heavy codebases.
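For reference, a minimal sketch of that pattern as a complete test; RenderMyApp is the function from the example above:

it('does not throw while rendering', () => {
  // Wrapping the call in an arrow function lets the matcher catch anything it throws.
  expect(() => RenderMyApp()).not.toThrow();
});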

@cpojer
Member

cpojer commented Dec 3, 2016

The default test that create-react-app ships with doesn't have an assertion: https://github.com/facebookincubator/create-react-app/blob/master/packages/react-scripts/template/src/App.test.js#L7

Jest also doesn't enforce usage of expect. You can use any assertion library you want as long as it throws on failure.
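For example (a sketch using Node's built-in assert module; any library that throws on failure works the same way):

const assert = require('assert');

it('works with plain node assert', () => {
  // assert throws on failure, which Jest reports as a failed test.
  assert.strictEqual(1 + 1, 2);
});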

cc @gaearon in case he has an opinion here.

@SimenB
Member

SimenB commented Dec 3, 2016

I use power-assert instead of expect, so auto-counting and throwing would make me 😭

But maybe some work can be put into https://www.npmjs.com/package/eslint-plugin-jest to get it closer to https://www.npmjs.com/package/eslint-plugin-ava, which has a lot of rules for making sure you use the API properly?

(I started migrating to Jest today, and the only thing I'm really missing from AVA is the ESLint plugin (and no globals, but that's more of a style thing, it doesn't really matter).)

@cpojer
Member

cpojer commented Dec 3, 2016

Yeah, I agree. I think between expect.assertions and an eslint plugin, we should be able to cover this use case without making Jest more restrictive.

I didn't realize there was an ESLint plugin for Jest. We should consider merging this into Jest, making it official and expanding its scope a lot, including an optional rule to make sure "expect" is used at least once within a test. @SimenB are you interested in working on this? @jkimbo would you be interested in working more on this ESLint plugin and helping to make it official?

@SimenB
Member

SimenB commented Dec 3, 2016

I'll happily contribute to the eslint plugin!

@cpojer
Member

cpojer commented Dec 3, 2016

That sounds great. To be honest, I haven't considered this myself before, so I'm gonna rely on ideas from you guys. What do you want it to look like and lint against? I can help figure out what the right defaults should be, though :) I care most about finding patterns that guide people to write high-quality tests, and rules that are helpful rather than annoying.

cc @zertosh who always has opinions on linting.

@jkimbo
Contributor

jkimbo commented Dec 3, 2016

@cpojer I am very happy to make it more official! Let me know anything I can do to help, and I'm always happy to accept pull requests.

@cpojer
Member

cpojer commented Dec 3, 2016

That's great! Would you be willing to move it into the "packages" folder of the Jest repo and add me to the project on npm?

@orta
Member Author

orta commented Dec 4, 2016

This thread went perfectly 👍

@jkimbo
Contributor

jkimbo commented Dec 4, 2016

@cpojer I've created a pull request for it: #2214 and I've added you to the npm package.

@cpojer
Member

cpojer commented Dec 4, 2016

Awesome!

@zertosh
Contributor

zertosh commented Dec 5, 2016

One thing I really like about tap is that you must either use t.end() to signify that your test is over, or t.plan(/* number of assertions */).
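For readers unfamiliar with tap, a minimal sketch of the two styles being described; add is a hypothetical function under test:

const t = require('tap');

t.test('plan declares the assertion count up front', t => {
  t.plan(1); // the test fails unless exactly one assertion runs
  t.equal(add(1, 1), 2);
});

t.test('end signals that the test is over', t => {
  t.equal(add(2, 2), 4);
  t.end(); // without plan(), end() marks the test as finished
});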

@SimenB
Member

SimenB commented Dec 5, 2016

That's one of the things I dislike about tap... 😆 end should only be needed in callbacks, and plan should really only be used if you assert in catch clauses or loops, for instance. IMO it just adds noise to the test, making me read more to see what's actually tested.

@cpojer
Member

cpojer commented Feb 25, 2017

All done!

@cpojer cpojer closed this as completed Feb 25, 2017
@SimenB
Member

SimenB commented Mar 6, 2017

To get back to this, it could be checked statically as well. https://github.com/tlvince/eslint-plugin-jasmine/blob/master/docs/rules/missing-expect.md

The issue in the OP is mitigated by #3067 though 😄

@GantMan
Contributor

GantMan commented May 16, 2017

@cpojer - would you accept a PR adding minAssertions?

I use expect.assertions(number) in my test helper, but I can't guarantee people have added more assertions inside the test itself. I'd be happy if the actual number were larger than the amount dictated in my helper function, but not lower.

@cpojer
Member

cpojer commented May 16, 2017

No, Jest 20 has expect.hasAssertions() for this purpose, but no minimum.
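For reference, a minimal sketch of expect.hasAssertions(); fetchData is a hypothetical async function:

it('resolves with data', () => {
  // Fails the test if no assertion at all is called during its run.
  expect.hasAssertions();
  return fetchData().then(data => {
    expect(data).toBeTruthy();
  });
});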

@tonyxiao

Is there a way to make every it statement automatically call expect.hasAssertions()? Of course it could be optional so it doesn't impact every Jest user.

@SimenB
Member

SimenB commented Jan 20, 2018

No. You can use the lint rule to ensure all tests have it

@tonyxiao

tonyxiao commented Jan 20, 2018

Actually, calling expect.hasAssertions() in afterEach seems to do the trick. Is this expected? Is it intended that one can call assertions in beforeEach / afterEach blocks? If so, can assertions also be called in beforeAll and afterAll blocks?
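A minimal sketch of the trick being described (keeping in mind the caveat in the next comment that this behavior is not an official part of the API):

afterEach(() => {
  // Runs after every test in this file and fails any test that made no assertions.
  expect.hasAssertions();
});

it('fails because it never asserts anything', () => {});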

@SimenB
Member

SimenB commented Jan 20, 2018

That's expected, but not on purpose (same thing as expect working outside of tests). It might break at any time; it's not part of the API.

@tonyxiao

Got it. It would be nice if a capability like this were part of the API. I genuinely want to assert that at least one assertion is called after each test, and this seems like a perfectly valid use case.

@alloy

alloy commented Jan 29, 2019

One issue with the afterEach(() => expect.hasAssertions()) approach is that when an exception gets thrown in a test, you really only want that exception to be reported; but because afterEach and beforeEach callbacks run regardless of exceptions being thrown in tests, you get additional unhelpful noise about no assertions having been made.

For instance, given a suite like the following:

describe("concerning assertions being made", () => {
  afterEach(() => expect.hasAssertions())

  it("fails when no assertions are made", () => {})

  it("does not check assertions are made when an exception is thrown", () => {
    throw new Error("only care about this error")
    expect(true).toEqual(true)
  })
})

The output looks like:

$ jest napkin.test.ts
  concerning assertions being made
    ✕ fails when no assertions are made (13ms)
    ✕ does not check assertions are made when an exception is thrown (1ms)

  ● concerning assertions being made › fails when no assertions are made

    expect.hasAssertions()

    Expected at least one assertion to be called but received none.

  ● concerning assertions being made › does not check assertions are made when an exception is thrown

    only care about this error

  ● concerning assertions being made › does not check assertions are made when an exception is thrown

    expect.hasAssertions()

    Expected at least one assertion to be called but received none.

The last reported error about missing assertions is unnecessarily noisy.


It would be nice to be able to do something like the following:

afterEach(spec => !spec.hasFailed && expect.hasAssertions())

Although I do understand wanting to keep details about the currently running spec hidden from the test file, so perhaps something like this could be considered instead?

afterEachSuccess(() => expect.hasAssertions())

@alloy

alloy commented Jan 29, 2019

Actually, the following situation also leads to this noisy reporting, so this is a more general problem with hasAssertions. Is this something that you’d consider a UX issue?

  it("does not check assertions are made when an exception is thrown", () => {
    expect.hasAssertions()
    return new Promise(() => {
      throw new Error("only care about this error")
    }).then(() => expect(true).toEqual(true))
  })

@thymikee
Collaborator

wdyt @SimenB?

@SimenB
Member

SimenB commented Jan 30, 2019

We should probably count a failing assertion as an assertion before rethrowing it, or something like that

@ballercat

Is there any interest from the Jest maintainers in adding a failWithoutAssertions option to the main config? A runtime check for missing assertions is very useful, specifically for misconfigured asynchronous tests where an assertion runs after the test has already completed. This is a common mistake made by folks new to async testing.

It seems like an optional setting in the config would help those of us who could really benefit from it. Working with large teams makes it kind of difficult to enforce that everyone uses expect.assertions, and the lint rule is not good enough for the example in the issue itself.

More than happy to help if that change would be welcomed.

@thymikee
Collaborator

Wouldn't it fit better as an ESLint rule?

@jeysal
Contributor

jeysal commented Mar 20, 2019

Probably yes (although there may be some cases where lint doesn't quite get it right, such as using a helper that does the expect). I'd be cautious about adding more coupling between the test runner and the assertion library; I think we already have too much of that.
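A minimal sketch of the helper case mentioned above; expectValidUser and createUser are hypothetical:

// Hypothetical helper that performs the assertion on the test's behalf.
const expectValidUser = user => {
  expect(user).toHaveProperty('id');
};

it('creates a valid user', async () => {
  const user = await createUser();
  // No literal expect() call appears in the test body, so a purely static
  // rule may flag this test even though an assertion does run at runtime.
  expectValidUser(user);
});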

@ballercat

The static analysis answer is popular in this issue report, but it's missing the forest for the trees. The crux of the problem is that static analysis via ESLint is sub-par for catching runtime issues.

As an example, the documented "correct" use of expect in the jest/expect-expect lint rule demonstrates what I'm talking about.

it('should work with callbacks/async', () => {
  somePromise().then(res => expect(res).toBe('passed'));
});

Yes, there is an expect here, but that expect may never run before the test finishes, because the promise is not returned. I/we use these rules all the time, but they do not catch logic issues in non-trivial tests, which is why we need a broader runtime solution.

If the answer to a global opt-in for this is a plain "no", then it is what it is. What would be an alternative option? There are test runners which provide this config (AVA), but switching is not an option. I've been looking into setupFilesAfterEnv; is that a possible solution for extending the functionality to support this globally? Maybe via a plugin of some kind?

From where I'm standing, it seems like we just need a way to detect that a test finished executing without triggering any assertions, and to fail that test; but again, without an explicit expect.assertions() etc. in each test. Should be doable.
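For what it's worth, a minimal sketch of the setupFilesAfterEnv idea, building on the afterEach/hasAssertions behavior discussed earlier in the thread (which, as noted, is not an official API); the setup file name is an assumption:

// jest.config.js
module.exports = {
  setupFilesAfterEnv: ['<rootDir>/jest.setup.js'],
};

// jest.setup.js
afterEach(() => {
  // Applied to every test file, so any test that makes no assertions fails.
  expect.hasAssertions();
});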

@mpacary

mpacary commented Mar 29, 2019

[EDIT : removed noise]
@ballercat I did not read the entire thread before posting my message, sorry for missing the point.

@ballercat

ballercat commented Apr 18, 2019

Following up on this one more time.

@mpacary thanks for following, but this isn't what I need.

After much trial and error, and after debugging some unsavory bugs, I have fallen back to patching the Jest API to do the thing I wanted (and what the issue is asking for). As I speculated before, I had to add a "plugin" package of sorts to perform runtime checks for missing assertions in tests. See the plugin here: https://github.com/ballercat/jest-plugin-must-assert

I pointed that thing at a repo with 10k+ tests and found some interesting results. Any conditional assertions are dangerous (async or not) and result in false positives. For example:

collection.forEach(value => {
  /* expectation on value */
});

... is a super common mistake to make. It's a huge bummer too, since no async logic is even involved: if the collection is empty, no assertions run, but the test still passes.
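One possible way to guard against the empty-collection case (a sketch, assuming collection is the array from the snippet above):

it('asserts on every value in the collection', () => {
  // Fail fast if there is nothing to assert on.
  expect(collection).not.toHaveLength(0);
  collection.forEach(value => {
    expect(value).toBeDefined();
  });
});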

My final two cents:

Some of the hoops one has to jump through to get this sort of feedback from Jest/the test runner are a bit extreme. I had to, for example, use the Zone.js library to ensure promises and setTimeouts did not register assertions after a test had completed.

I would love for the Jest team to consider some way to improve the integration of the assertion library with the actual test runner. The global nature of expect is kind of a mess. I think that might be a byproduct of building on top of Jasmine to begin with, but whatever. Honestly, if expect were passed into each test as an argument, this would be trivial to implement/support (e.g. avajs).

Thanks for everyone's help!
