Opened 7 years ago
Closed 7 years ago
#4948 closed defect (fixed)
Issues with verification reporting on Hudson
| Reported by: | Francesco Casella | Owned by: | Adrian Pop |
|---|---|---|---|
| Priority: | critical | Milestone: | 1.13.0 |
| Component: | Testing Framework | Version: | |
| Keywords: | | Cc: | |
Description
Please check the report of the ScalableTestSuite Hudson test. For some models, e.g. SimpleAdvection_N_XXX, the smaller ones pass the verification successfully, while for the larger ones the verification result cell is brown.
The reference files have been created using the same logic for all sizes, so I don't think there is anything wrong with the reference files (though there could be).
It doesn't seem to be a matter of the number of variables, either. If you check SimpleAdvection_N_XXX, the first two cases have the same number of checked signals despite their different sizes. On the other hand, in this class of models the number of simulation intervals grows with N, so maybe the problem is that some data files have too many time points (not signals). In some cases a lot of detail is genuinely needed, so if such a limit exists, I would suggest removing it, or at least setting it to a much higher value than the current one.
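As a quick way to test the too-many-time-points hypothesis, one could count the data rows in the result CSVs and compare them against a suspected limit. The sketch below is purely illustrative: the CSV layout (one header row, one row per time point) and the limit value are assumptions, not details taken from the actual testing framework.

```python
import csv
import io

def count_time_points(csv_text):
    """Count data rows (time points) in a CSV result file, excluding the header."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader, None)  # skip the header row
    return sum(1 for _ in reader)

def exceeds_limit(csv_text, max_points=5000):
    # 5000 is a made-up placeholder for the suspected point-count limit
    return count_time_points(csv_text) > max_points

# Small example with 3 time points
sample = "time,x\n0.0,1.0\n0.5,0.9\n1.0,0.8\n"
print(count_time_points(sample))  # → 3
print(exceeds_limit(sample))      # → False
```

If the failing models consistently exceed some threshold while the passing ones stay below it, that would support the hypothesis that the limit applies to time points rather than signals.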
The other issue is that some (but not all; see e.g. this link) of the models that fail the verification have broken links to the JavaScript and CSV files that demonstrate the mismatch; see, e.g., this link.
Change History (2)
comment:1 by , 7 years ago
comment:2 by , 7 years ago
| Resolution: | → fixed |
|---|---|
| Status: | new → closed |
The second issue is better described in #4980.
The first issue is actually better described in #4961.