500 error after uploading rust source-based coverage results via llvm-cov


I’m trying to use the new source-based coverage reporting in Rust with Codecov, and it looks like I got everything working: the coverage report gets generated and uploaded. But when I navigate to the coverage results, all I get is an HTTP 500 error with little additional information about what is wrong.

Commit SHAs



A link to the repository in question

CI/CD or Build URL


GitHub Actions uploader, which seems to use the bash uploader

Codecov Output

Run codecov/codecov-action@v1
/usr/bin/bash codecov.sh -Q github-action -n  -F 

  _____          _
 / ____|        | |
| |     ___   __| | ___  ___ _____   __
| |    / _ \ / _` |/ _ \/ __/ _ \ \ / /
| |___| (_) | (_| |  __/ (_| (_) \ V /
 \_____\___/ \__,_|\___|\___\___/ \_/

==> GitHub Actions detected.
    project root: .
    Yaml found at: codecov.yml
==> Running gcov in . (disable via -X gcov)
==> Python coveragepy not found
==> Searching for coverage reports in:
    + .
    -> Found 2 reports
==> Detecting git/mercurial file structure
==> Reading reports
    + ./.github/workflows/coverage.yml bytes=1162
    + ./coverage.txt bytes=16374077
==> Appending adjustments
    -> No adjustments found
==> Gzipping contents
==> Uploading reports
    url: https://codecov.io
    query: branch=jane%2Fsource-coverage&commit=c62a091f9f0c38130663c8983547017b7fea3ebd&build=366901525&build_url=http%3A%2F%2Fgithub.com%2FZcashFoundation%2Fzebra%2Factions%2Fruns%2F366901525&name=&tag=&slug=ZcashFoundation%2Fzebra&service=github-actions&flags=&pr=1293&job=&cmd_args=Q,n,F
->  Pinging Codecov
->  Uploading to
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100 3218k    0     0  100 3218k      0  8605k --:--:-- --:--:-- --:--:-- 8605k
    -> View reports at https://codecov.io/github/ZcashFoundation/zebra/commit/fc861bbc5df8dc110b6fb38e8f7c2d19913f2c57

Expected Results

I’d have expected either to see an error indicating that the coverage format was incorrect, or to see the coverage results when I navigated to the coverage report.

Actual Results

The upload apparently exited successfully, but the report was empty.

I have the exact same issue doing a similar thing.

As well as the 500 error, there is a “400 - Invalid signature” error returned when GitHub tries to send the webhook to Codecov.

@yaahc, apologies, but would you be able to supply a newer commit SHA and Codecov upload output? We made some changes here.

@Diggsey, can you please provide some URLs and commit SHAs?

I opened a support ticket for this.

Got it, we’ll take a look there then.

Tom, I was able to test some more, and now it seems I can get the upload to work as long as I convert the source-based coverage results to the lcov output format with llvm-cov export. But that format discards region-based coverage information, which is exactly the information I’m trying to get to show up.

I’ve reproduced this on a smaller personal repository of mine so I can do quicker tests.

https://github.com/yaahc/btparse/tree/coverage-tests is the repo and branch

I’m running the tests via https://github.com/yaahc/btparse/blob/coverage-tests/test_coverage.sh:

#! /bin/bash
set -e
set -o xtrace

rm -rf ./target/
mkdir -p ./target/debug/coverage
DO_ET="fool" LLVM_PROFILE_FILE="${PWD}/target/debug/coverage/test.%p.profraw" RUSTFLAGS="-Zinstrument-coverage" cargo test
git add .
git commit -m "checkpoint"
git push
$(rustc --print target-libdir)/../bin/llvm-profdata merge --sparse ./target/debug/coverage/test.*.profraw -o ./target/test.profdata

# This one works and shows all the details I want in the CLI
$(rustc --print target-libdir)/../bin/llvm-cov show -Xdemangler=rustfilt -instr-profile=./target/test.profdata $(find target/debug/deps -type f -perm -u+x ! -name '*.so') -show-line-counts-or-regions -show-instantiations

# This one gives an error indicating that there was an error parsing the report
$(rustc --print target-libdir)/../bin/llvm-cov show -Xdemangler=rustfilt -instr-profile=./target/test.profdata $(find target/debug/deps -type f -perm -u+x ! -name '*.so') -show-line-counts-or-regions -show-instantiations > coverage.txt
mv coverage.txt ./target
# bash <(curl -s https://codecov.io/bash) -f coverage.txt -t <token>

$(rustc --print target-libdir)/../bin/llvm-cov export -format="lcov" -instr-profile=./target/test.profdata $(find target/debug/deps -type f -perm -u+x ! -name '*.so') > lcov.info
mv lcov.info ./target
# This one works, but discards region coverage results and shows the entire line as covered even if only part of it is
# bash <(curl -s https://codecov.io/bash) -f lcov.info -t <token>

# This one gives an error indicating that there was an error parsing the report
$(rustc --print target-libdir)/../bin/llvm-cov export -format="text" -instr-profile=./target/test.profdata $(find target/debug/deps -type f -perm -u+x ! -name '*.so') > coverage.json
mv coverage.json ./target
# bash <(curl -s https://codecov.io/bash) -f coverage.json -t <token>

Here’s an example of what I want to see

   93|       |fn deserialize_str(bt: &str) -> Result<Backtrace, Error> {
   94|      3|    let mut frames = vec![];
   95|      3|    let mut bt = deser::header(bt)?;
                      ^2                        ^1
   96|       |
   97|       |    loop {
   98|     29|        let (bt_next, frame) = deser::frame(bt)?;
   99|     29|        bt = bt_next;
  100|     29|        frames.push(frame);
  101|     29|
  102|     29|        let (bt_next, had_comma) = deser::trailing_comma(bt);
  103|     29|        bt = bt_next;
  104|       |
  105|     29|        if !had_comma {
  106|      2|            break;
  107|       |        }
  108|     27|    }
  109|       |
  110|      2|    let bt = deser::close_bracket(bt)?;
  111|       |
  112|      2|    if !bt.is_empty() {
  113|      0|        Err(Kind::UnexpectedInput(bt.into()))?;
  114|      2|    }
  115|       |
  116|      2|    Ok(Backtrace { frames })
  117|      3|}

And here’s what I’m actually seeing

And here’s an example of me trying to upload a json format output from llvm-cov export which does include region information

Great work getting the lcov export to work. I checked the llvm-cov docs at https://www.llvm.org/docs/CommandGuide/llvm-cov.html and it looks like the -show-line-counts-or-regions flag is only available with the html and text outputs of show. While the CLI help lists lcov as a possible format for show, it appears to have been superseded by export, which does not support the regions flag.

From the link you posted, it seems like llvm-cov export should include region info when exporting as JSON:

When exporting JSON, the regions, functions, expansions, and summaries of the coverage data will be exported. When exporting an lcov trace file, the line-based coverage and summaries will be exported.

If I remember correctly, the JSON that llvm-cov export creates is very … different and does not provide the information Codecov needs in order to extract the data you’re after.

That said, it’s been at least a year since I experimented with these options. If you want to test and report back, it may work better with Rust than it did when I tested it with Xcode coverage data.

I probably won’t do much more testing, because I don’t really have any insight into what level of region-based coverage Codecov supports, so I’d just be taking shots in the dark.

That said, I did look up the format for the json that llvm-cov exports.

I’m not sure how stable this format is; it already seems to mismatch the copy I have locally, which has 6 fields in a Segment even though this source copy only seems to produce 5, and there’s no real description of what each value means.
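To make the mismatch concrete, here’s a minimal sketch (against a hand-written sample, not real output) of pulling per-line counts out of the export JSON. The field layout — segments as arrays starting with [line, col, count, has_count, is_region_entry], with a sixth field added in newer llvm versions — is my reading of the llvm source, so treat it as an assumption:

```python
import json

# Hand-written sample mimicking `llvm-cov export -format=text` output.
# Note segments with both 5 and 6 fields, matching the mismatch described above.
sample = json.loads("""
{
  "type": "llvm.coverage.json.export",
  "version": "2.0.1",
  "data": [{
    "files": [{
      "filename": "src/lib.rs",
      "segments": [
        [94, 5, 3, true, true],
        [95, 27, 2, true, true, false],
        [95, 47, 1, true, true]
      ]
    }]
  }]
}
""")

def line_counts(export):
    """Collapse segments into a per-file map of line -> max execution count."""
    counts = {}
    for unit in export["data"]:
        for f in unit["files"]:
            per_line = counts.setdefault(f["filename"], {})
            for seg in f["segments"]:
                # Both 5- and 6-field segments start with
                # [line, col, count, has_count, ...], so only index the prefix.
                line, _col, count, has_count = seg[0], seg[1], seg[2], seg[3]
                if has_count:
                    per_line[line] = max(per_line.get(line, 0), count)
    return counts

print(line_counts(sample))
```

Only indexing the shared prefix sidesteps the 5-vs-6-field instability, at the cost of ignoring whatever the extra field means.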

I wouldn’t mind doing the extra work to reverse engineer this JSON format and convert it into some more stable, well-known format; the main impediment is not having a known-good, region-supporting format that I know codecov.io will accept to convert into.

I’ve not seen region support before, and I do not believe we support it, since I can find no mention of the word “region” in the processor codebase. I’d be happy to create a feature request for this, which would get prioritized according to interest, since I don’t know what would be involved in adding it.

Codecov supports this JSON format https://istanbul.js.org/docs/advanced/alternative-reporters/#json (sorry, I couldn’t find a spec) and our own custom format https://docs.codecov.io/docs/codecov-custom-coverage-format. Note: we do not support json-summary; it is lacking the information we need.
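For anyone following along, here’s a minimal sketch of what the custom format looks like as I read the linked docs — the “hit/total” partial-coverage notation and the exact key shapes are my assumptions from that page, not verified against the processor:

```python
import json

# Hypothetical report in the Codecov custom coverage format: a top-level
# "coverage" map from file path to {line number: hits}, where hits may be
# an int, null (skipped), or a "hit/total" string for partial branch coverage.
report = {
    "coverage": {
        "src/lib.rs": {
            "94": 3,      # fully covered, executed 3 times
            "95": "1/2",  # partially covered: 1 of 2 branches hit
            "113": 0,     # never executed
        }
    }
}

print(json.dumps(report, indent=2))
```

If that "1/2" notation really does survive the processor, it would be one way to surface the region-level partial coverage discussed above without a new format.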

Regarding stability, I know that Apple changes their Xcode tools almost every version. I don’t know how stable the underlying llvm project is.

It looks like the format you posted above might match the other one, maybe give it a try and see if the processor accepts it? If you do, please share the commit SHA so I can see if there are any errors in the logs.

(cc: @ajbrown)

If by other one you mean the istanbul.js one, it doesn’t look like it matches.

Either way though, both of the coverage formats you linked seem to include column information or some level of line-wise granularity, which is really good. I’m going to try hand-writing a coverage report based on those formats, using some of the details from the real coverage report, and see if I can get Codecov to show partial coverage within a single line. If so, I’ll probably just go ahead and throw together a quick converter from the llvm-cov export JSON format to the istanbul.js JSON format, or whatever format works.
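As a starting point, here’s a rough sketch of what that converter might look like, going from llvm-cov segments straight to Codecov’s custom format. Both the segment layout and the “hit/total” partial notation are assumptions on my part, not confirmed behavior:

```python
def segments_to_codecov(export):
    """Convert llvm-cov export JSON into the Codecov custom coverage format,
    marking a line as partial ("hit/total") when only some of its regions ran."""
    coverage = {}
    for unit in export["data"]:
        for f in unit["files"]:
            regions = {}  # line -> list of region execution counts
            for seg in f["segments"]:
                # Assumed layout: [line, col, count, has_count, is_region_entry, ...]
                line, _col, count, has_count, is_entry = seg[:5]
                if has_count and is_entry:
                    regions.setdefault(line, []).append(count)
            lines = {}
            for line, counts in regions.items():
                hit = sum(1 for c in counts if c > 0)
                if hit == len(counts):
                    lines[str(line)] = max(counts)   # every region on the line ran
                elif hit == 0:
                    lines[str(line)] = 0             # no region on the line ran
                else:
                    lines[str(line)] = f"{hit}/{len(counts)}"  # partial line
            coverage[f["filename"]] = lines
    return {"coverage": coverage}

# Tiny hand-made export: line 95 has two regions, only one of which executed.
export = {"data": [{"files": [{"filename": "src/lib.rs", "segments": [
    [94, 5, 3, True, True],
    [95, 27, 2, True, True],
    [95, 47, 0, True, True],
]}]}]}

print(segments_to_codecov(export))
# {'coverage': {'src/lib.rs': {'94': 3, '95': '1/2'}}}
```

This deliberately throws away expansion coverage; it only tries to recover the within-line partial information that the lcov path loses.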

This still misses out on expansion coverage, though, which I don’t think I mentioned before. Another thing source-based coverage does is give individual coverage results for each expansion of a generic type.

So my guess is it’s still a good idea to create an issue to directly support the llvm source-based coverage format, either the profdata itself or some format derived from it.

Oh, actually, @drazisil, I remember seeing profdata references in the bash uploader for Swift. That might indicate some level of pre-existing support for this coverage format, because this coverage option creates profraw data, which we then convert into profdata and finally into the various formats I’ve tried uploading. Something to look into, at least.