
596 add code coverage #600

Merged
AlexanderFengler merged 12 commits into main from 596-add-code-coverage
Nov 25, 2024

Conversation

@cpaniaguam
Collaborator

@cpaniaguam cpaniaguam commented Oct 30, 2024

@cpaniaguam cpaniaguam linked an issue Oct 30, 2024 that may be closed by this pull request
@codecov

codecov bot commented Nov 21, 2024

Welcome to Codecov 🎉

Once you merge this PR into your default branch, you're all set! Codecov will compare coverage reports and display results in all future pull requests.

ℹ️ You can also turn on project coverage checks and project coverage reporting in pull request comments

Thanks for integrating Codecov - We've got you covered ☂️

@cpaniaguam cpaniaguam force-pushed the 596-add-code-coverage branch from 4a279d8 to 1e65673 on November 21, 2024 17:46
Collaborator

@digicosmos86 digicosmos86 left a comment

Thank you! I am soooooo happy with the coverage of our code! I just mentioned in the comment that we have moved away from using conda as our test environment. Can you take a look at main?

@cpaniaguam cpaniaguam force-pushed the 596-add-code-coverage branch from 0bc29ac to 3fca065 on November 22, 2024 15:15
Collaborator

@digicosmos86 digicosmos86 left a comment

LGTM now

```yaml
run_tests:
  runs-on: ubuntu-latest
  if: ${{ ! contains(github.event.head_commit.message, '[skip slow tests]') }}
  if: ${{ ! contains(github.event.head_commit.message, '[skip fast tests]') }}
```
Collaborator Author

@digicosmos86 I am not sure this is needed. Can this line be dropped?

Collaborator

So the setup is now to run all tests? Can codecov collect test results from separate runs? In principle I'd want the fast tests to run first so the suite fails fast.

Collaborator Author

@digicosmos86

Can codecov collect test results from separate runs?

It seems like the tooling for that is not reliable -- see https://stackoverflow.com/questions/73688076/pytest-with-cov-append-flag-overwrites-coverage-file. Besides, we are using xdist to run the tests in parallel. We could mark the slow tests but it's not clear to me that would matter to xdist. We can have a fail fast strategy with the --exitfirst flag, however. I'll add that.
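A minimal sketch of what the resulting CI step could look like, assuming pytest-cov and pytest-xdist are installed (the step names and flag set here are illustrative, not necessarily the PR's actual workflow):

```yaml
# Illustrative workflow step: run tests in parallel, stop at the first
# failure, and emit a single coverage report for Codecov to upload.
- name: Run tests with coverage
  run: pytest -n auto --exitfirst --cov --cov-report=xml
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
```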

Collaborator

That's great but how about the order of execution? Would fast tests be run before slow tests? Is there a way to determine that?

@AlexanderFengler
Member

@digicosmos86 the approval here is after the last change? Or another review needed?

@digicosmos86
Collaborator

We need to figure out the fast and slow tests situation. For now let's discuss this a bit more

@cpaniaguam
Collaborator Author

I could try to run the tests in succession and see if the tool for combining coverage reports works for us. However, please keep in mind that the tests are run in parallel, most of them are not slow, and there is a fail-fast strategy in place already.

@digicosmos86
Collaborator

Yeah, the parallel tests do complicate things a bit, but even though parallelization may introduce some randomness, I think it's still possible to figure out the order in which tests are scheduled to run, which is connected to how pytest searches for and collects tests. There should be some documentation on that. I don't mind restructuring the tests a bit (i.e., putting all fast tests in a separate directory) so that fast tests run before the hundreds of slow ones.

@cpaniaguam
Collaborator Author

cpaniaguam commented Nov 25, 2024

@digicosmos86 Please see this and the changes to tests/conftest.py.
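A sketch of what the tests/conftest.py change described here could look like; the `slow` marker name and the reordering strategy are assumptions for illustration, not necessarily what the PR actually does:

```python
# Hypothetical tests/conftest.py sketch: schedule fast tests before slow ones.
# The "slow" marker name is an assumed convention, not taken from the PR.

def pytest_configure(config):
    # Register the marker so `pytest --strict-markers` accepts it.
    config.addinivalue_line("markers", "slow: marks tests as slow to run")

def pytest_collection_modifyitems(config, items):
    """Reorder collected tests so items without a 'slow' marker run first.

    sort() is stable, so the relative order within each group is preserved.
    """
    items.sort(key=lambda item: item.get_closest_marker("slow") is not None)
```

With pytest-xdist, collection order only influences scheduling, so this biases fast tests toward running first rather than strictly guaranteeing it.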

@digicosmos86
Collaborator

Ah, got it! You did exactly that. Cool, then I am OK with merging. @AlexanderFengler would you like to give it another look?

@digicosmos86 digicosmos86 deleted the 596-add-code-coverage branch February 10, 2026 15:56


Development

Successfully merging this pull request may close these issues.

Add code coverage

3 participants