PR to add codecov badge drops coverage

See “Add coverage badge” by thomasrockhu (substack/tape, Pull Request #544): coverage should not have dropped in a PR that does nothing but update the readme.


Thanks @ljharb, this got lost from my queue. I’ll take a look at why this week.

@ljharb three things come to mind right now:

  1. A significant number of commits are stuck in a processing state. We’ve made a fix for this recently.
  2. Some commits are not found. This is typically due to rebasing, but I think you need to add fetch-depth: 2 (or any value greater than 1; 0 fetches full history) to the actions/checkout step; see the workflow sketch after this list.
  3. There are a lot of uploads here per commit. We recently started enforcing a limit of approximately 100 uploads per commit. I don’t know whether you’ll hit it (if you do, Codecov will respond with an error), but I wanted to make sure you knew about it.
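For item 2, here’s a minimal sketch of the workflow change (the codecov-action step is illustrative, not taken from your actual config):

```yaml
# Sketch only: fetch-depth: 2 exposes the parent commit, so the uploader can
# resolve the real SHA instead of the synthetic commit actions/checkout creates.
- uses: actions/checkout@v2
  with:
    fetch-depth: 2   # any value > 1 works; 0 fetches full history
- uses: codecov/codecov-action@v1
```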

Re 1, thanks!

Re 2, I rebase and force push constantly; I’m not sure why I’d need to alter the way checkouts are done. If the SHA isn’t available, I’d just expect the job to fail.

Re 3, I run 200-400 jobs per run, in 200-300 projects, all with coverage data, so that limit will effectively eliminate my ability to use Codecov whatsoever. Is there any workaround?

@ljharb

  1. Absolutely
  2. That shouldn’t be an issue. The problem is that actions/checkout creates a new commit with its own SHA; we use fetch-depth: 2 to grab the true SHA.
  3. Woof, ok. I don’t have a great solution for you right now. The best I can think of is to aggregate some of your uploads together. Could you describe your CI pipeline a little bit here? Maybe we can figure out a good solution.

I can certainly add fetch-depth: 2, but I’ve not had to do that on hundreds of other repos, so why would I need to do it now?

As for number 3, I’m not sure how to do that. I test on every minor version of node from 0.6 onward, on most projects, which amounts to around 218 jobs (a number that increases by 1-3 every time node does a release).

What’s the point of limiting it to 100 uploads? If I condense 218 jobs down to a handful of uploads, using https://npmjs.com/istanbul-merge or something, you still have to process roughly the same amount of coverage data, so what are you optimizing for with this arbitrary and unannounced limit?
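If I did condense them, I assume it would look roughly like this (a sketch, not my actual config: the aggregate job, the artifact layout, and the istanbul-merge invocation are all assumptions):

```yaml
# Sketch: each matrix job would upload its coverage file as a build artifact
# instead of sending it to Codecov directly; one final job merges everything
# and makes a single upload.
aggregate:
  needs: [test]                     # assumed name of the test matrix job
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v2
      with:
        fetch-depth: 2
    - uses: actions/download-artifact@v2   # no `name` given, so all artifacts download
      with:
        path: coverage-parts
    # assumed istanbul-merge CLI shape: --out <file> plus input globs
    - run: npx istanbul-merge --out coverage/coverage-final.json 'coverage-parts/**/*.json'
    - uses: codecov/codecov-action@v1      # one upload instead of ~218
```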