Proceedings of the Institute for System Programming of the RAS (Proceedings of ISP RAS)

Expert Assessment of Verification Tool Results

https://doi.org/10.15514/ISPRAS-2020-32(5)-1

Abstract

Verification tools can produce various kinds of results while checking programs against requirement specifications. Experts, who search for errors and estimate the completeness of verification, rely primarily on verdicts, violation witnesses and code coverage reports. To apply verification tools in practice, when many program configurations and versions are checked against various requirements, experts need convenient tools that automate the assessment of verification results. In this paper, we propose new methods for expert evaluation of verification results that address the problems we have found most significant in our experience of verifying large programs against a large number of requirement specifications. Some ideas are borrowed from the areas of testing and static analysis. However, the specific methods and technical solutions are unique, since the results provided by verification tools either are not found in other areas or have special semantics. The paper presents our approaches and their implementation in the Klever software verification framework.

About the Authors

Vladimir Anatolyevich GRATINSKIY
Ivannikov Institute for System Programming of the RAS
Russian Federation
Software developer at the Software Engineering Department


Evgeny Mikhailovich NOVIKOV
Ivannikov Institute for System Programming of the RAS
Russian Federation
PhD, Senior Researcher and Software Project Manager at the Software Engineering Department


Ilja Sergeevich ZAKHAROV
Ivannikov Institute for System Programming of the RAS
Russian Federation
PhD, Junior Researcher at the Software Engineering Department


References

1. R. Jhala and R. Majumdar. Software model checking. ACM Computing Surveys, vol. 41, issue 4, 2009, article 21, 54 p.

2. D. Beyer. Advances in Automatic Software Verification: SV-COMP 2020. Lecture Notes in Computer Science, vol. 12079, 2020, pp. 347-367.

3. Definitions and Rules of the 10th International Competition on Software Verification. URL: https://sv-comp.sosy-lab.org/2021/rules.php, accessed 27.11.2020.

4. E. Novikov and I. Zakharov. Towards automated static verification of GNU C programs. Lecture Notes in Computer Science, vol. 10742, 2018, pp. 402-416.

5. D. Beyer, M. Dangl, D. Dietsch, M. Heizmann, and A. Stahlbauer. Witness validation and stepwise testification across software verifiers. In Proc. of the 10th Joint Meeting on Foundations of Software Engineering, 2015, pp. 721-733.

6. D. Beyer, M. Dangl, D. Dietsch, and M. Heizmann. Correctness witnesses: exchanging verification results between verifiers. In Proc. of the 24th ACM SIGSOFT International Symposium on Foundations of Software Engineering, 2016, pp. 326-337.

7. Static Driver Verifier Report. URL: https://docs.microsoft.com/en-us/windows-hardware/drivers/devtest/static-driver-verifier-report, accessed 27.11.2020.

8. D. Beyer and M. Dangl. Verification-aided debugging: An interactive web-service for exploring error witnesses. Lecture Notes in Computer Science, vol. 9780, 2016, pp. 502-509.

9. Clade. URL: https://github.com/17451k/clade, accessed 27.11.2020.

10. G.C. Necula, S. McPeak, S.P. Rahul, and W. Weimer. CIL: Intermediate Language and Tools for Analysis and Transformation of C Programs. Lecture Notes in Computer Science, vol. 2304, 2002, pp. 213-228.

11. Exchange Format for Violation Witnesses and Correctness Witnesses. URL: https://github.com/sosy-lab/sv-witnesses, accessed 27.11.2020.

12. Klever extended violation witness format. URL: https://klever.readthedocs.io/en/latest/dev.html#extended-violation-witness-format, accessed 27.11.2020.

13. Klever error trace format. URL: https://klever.readthedocs.io/en/latest/dev.html#error-trace-format, accessed 27.11.2020.

14. R. Castaño, V. Braberman, D. Garbervetsky, and S. Uchitel. Model checker execution reports. In Proc. of the 32nd IEEE/ACM International Conference on Automated Software Engineering, 2017, pp. 200-205.

15. LCOV. URL: https://github.com/linux-test-project/lcov, accessed 27.11.2020.

16. Klever code coverage format. URL: https://klever.readthedocs.io/en/latest/dev.html#code-coverage-format, accessed 27.11.2020.

17. Klever. URL: https://forge.ispras.ru/projects/klever, accessed 27.11.2020.

18. [PATCH] s5p-jpeg: hangle error condition in s5p_jpeg_probe. URL: https://lkml.org/lkml/2020/11/13/673, accessed 27.11.2020.

19. Klever Bridge issues. URL: https://tinyurl.com/yyq9ljk8, accessed 27.11.2020.


For citations:


GRATINSKIY V.A., NOVIKOV E.M., ZAKHAROV I.S. Expert Assessment of Verification Tool Results. Proceedings of the Institute for System Programming of the RAS (Proceedings of ISP RAS). 2020;32(5):7-20. (In Russ.) https://doi.org/10.15514/ISPRAS-2020-32(5)-1



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2079-8156 (Print)
ISSN 2220-6426 (Online)