
[Tests] consider erroring tests on warnings #10376

Open
maxpatiiuk opened this issue Sep 23, 2024 · 1 comment
Labels
- 1 - assigned: Issues that are assigned to a sprint and a team member.
- blocked: This issue is blocked by another issue.
- estimate - 3: A day or two of work, likely requires updates to tests.
- p - low: Issue is non core or affecting less than 10% of people using the library.
- testing: Issues related to automated or manual testing.

Comments

@maxpatiiuk
Member

Priority impact

p - low

Test type

E2E

Which Component(s)

Unstable Tests

At the moment, if console.error is called during a component test, the test fails:

afterEach(() => expect(globalError).not.toHaveBeenCalled());
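For illustration, here is a framework-agnostic sketch of the same pattern (the helper name `spyOnConsoleError` is assumed; Calcite's actual setup file wires this up with a Jest spy): record console.error calls so a teardown hook can assert that none occurred.

```javascript
// Sketch only (helper name assumed, not Calcite's real setup): record
// console.error calls so teardown can assert that none slipped through.
function spyOnConsoleError() {
  const original = console.error;
  const calls = [];
  console.error = (...args) => {
    calls.push(args);
  };
  return {
    // throws if any console.error call happened during the test
    assertNotCalled() {
      if (calls.length > 0) {
        throw new Error(`console.error was called ${calls.length} time(s)`);
      }
    },
    restore() {
      console.error = original;
    },
  };
}
```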

In Lumina, the default behavior is to fail not just on console.error, but also on console.warn.

That behavior was chosen because components generally should not emit warnings; if they do, it may mean they are being used incorrectly. If a test wants to explicitly exercise incorrect/deprecated usage, it can do so by mocking out console.warn and asserting that the expected warning was triggered.
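As a sketch of what that Lumina-style behavior amounts to (an assumption about the mechanism, not Lumina's actual implementation), a setup file could replace console.warn with a version that fails loudly:

```javascript
// Sketch only (assumed mechanism, not Lumina's implementation): replace
// console.warn so any unexpected warning raises an error, which makes
// the surrounding test fail instead of passing silently.
function failOnConsoleWarn() {
  const original = console.warn;
  console.warn = (...args) => {
    original(...args); // still surface the warning for debugging
    throw new Error(`Unexpected console.warn: ${args.join(" ")}`);
  };
  // returns an undo function to call from afterEach/afterAll
  return () => {
    console.warn = original;
  };
}
```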

Options:

  1. Change Calcite's setupFile to fail on console.warn too. If a test expects console.warn to be called, it can mock it as follows:

    const originalConsoleWarn = console.warn;
    
    beforeEach(() => {
      console.warn = jest.fn(); // or vi.fn() in Vitest (the codemod will take care of this, so don't worry)
    });
    
    afterEach(() => {
      console.warn = originalConsoleWarn;
    });
    
    // optionally, inside the test, assert that console.warn was called to make sure only expected warnings were produced
    • I see several tests are emitting warnings because deprecated components are rendered, or because properties are modified in componentDidUpdate, triggering another re-render. Thus, you might want to create a small test util that encapsulates the above beforeEach/afterEach calls and reuse it across the tests that emit console warnings.
  2. Keep the current behavior and override the default Lumina behavior. Changing the current behavior is not a requirement for migrating; it's just an opportunity to rethink the options.
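The small test util suggested in option 1 could look like the following minimal sketch (the helper name `mockConsoleWarn` is hypothetical): it swaps console.warn for a recording stub and hands back the captured calls plus a restore function, so individual suites can wire it into their own beforeEach/afterEach.

```javascript
// Hypothetical helper (name assumed): swap console.warn for a recording
// stub; callers hook this into beforeEach/afterEach themselves.
function mockConsoleWarn() {
  const original = console.warn;
  const calls = [];
  console.warn = (...args) => {
    calls.push(args);
  };
  return {
    calls, // each entry is the argument list of one console.warn call
    restore() {
      console.warn = original;
    },
  };
}
```

A suite covering deprecated usage could then assert on the captured calls to confirm that only the expected warning fired.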

Test error, if applicable

No response

PR skipped, if applicable

Additional Info

No response

@maxpatiiuk maxpatiiuk added testing Issues related to automated or manual testing. 0 - new New issues that need assignment. needs milestone Planning workflow - pending milestone assignment, has priority and/or estimate. labels Sep 23, 2024
@github-actions github-actions bot added the p - low Issue is non core or affecting less that 10% of people using the library label Sep 23, 2024
@jcfranco jcfranco added estimate - 3 A day or two of work, likely requires updates to tests. blocked This issue is blocked by another issue. 1 - assigned Issues that are assigned to a sprint and a team member. and removed 0 - new New issues that need assignment. labels Oct 9, 2024
@jcfranco
Member

jcfranco commented Oct 9, 2024

Blocked by #10310.

@geospatialem geospatialem added this to the Stalled milestone Oct 9, 2024
@geospatialem geospatialem removed the needs milestone Planning workflow - pending milestone assignment, has priority and/or estimate. label Oct 9, 2024