Functional Testing of Language Virtual Machines Based on Formal ISA Specifications
https://doi.org/10.15514/ISPRAS-2025-37(1)-4
Abstract
Programming languages built on language virtual machines (VMs) are highly popular today, which drives the development of this technology. Developing a language VM is a complex process during which errors can be introduced into the designed system. To ensure the quality of a VM implementation, the development process includes a testing stage. Testing must answer two main questions: how to create test programs and how to check the results of their execution. This article presents a method for functional testing of language VMs based on formal specifications of the instruction set architecture (ISA), along with an implementation of the proposed approach. Based on the VM documentation, the ISA is specified in an architecture description language, and an executable model is built from this specification. Test templates, which are parameterized descriptions of test programs, are used to automate the creation of test programs. Templates are written in a special domain-specific language that supports various generation techniques and gives access to the VM bytecode derived from the formal specifications. In the presented method, test templates can be written manually, generated automatically according to a target criterion, or obtained from third-party generators. From the test templates and the executable model, test programs are generated in bytecode and aimed at checking particular functionality or properties of the system under test. Bytecode is the VM's native language and therefore can exercise all of its functionality. Each test program is translated into a binary program and executed on the VM, and an execution trace is collected during the run. A trace adapter is created to analyze the execution trace, and a test oracle is built on top of the executable model and the trace adapter. The oracle checks the test results by comparing the execution trace with the results of executing the binary program on the executable model. The method is implemented in the MicroTESK tool version 2.6 and was used to test the Ark (Panda) VM.
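The template expansion step described above can be illustrated with a minimal sketch. The instruction table, template shape, and function names below are hypothetical stand-ins for the metadata that the actual tool derives from the nML/ISA specification and for its Ruby-based template DSL; they only show the idea of turning a parameterized description into concrete test programs.

```python
import random

# Hypothetical instruction descriptions, standing in for metadata that
# would be derived from the formal ISA specification (illustrative only).
ISA = {
    "lda": {"operands": 1},
    "add": {"operands": 2},
    "sub": {"operands": 2},
}

def expand_template(template, registers, seed=0):
    """Expand a parameterized test template into a concrete test program.

    A template is a list of (mnemonic, operand placeholders); each
    placeholder is either a concrete value or the marker "?", which is
    filled with a randomly chosen register (one simple generation technique
    among those a template DSL could offer).
    """
    rng = random.Random(seed)
    program = []
    for mnemonic, operands in template:
        assert mnemonic in ISA, f"unknown instruction: {mnemonic}"
        assert len(operands) == ISA[mnemonic]["operands"]
        concrete = [rng.choice(registers) if op == "?" else op
                    for op in operands]
        program.append((mnemonic, concrete))
    return program

template = [("lda", ["?"]), ("add", ["?", "?"]), ("sub", ["?", "0"])]
print(expand_template(template, registers=["r0", "r1", "r2"], seed=42))
```

Re-running the expansion with different seeds or register sets yields a family of test programs from one template, which is the point of parameterization.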
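The oracle's comparison step can likewise be sketched. The trace record format (a sequence of program-counter/state pairs) is an assumption for illustration; in practice the trace adapter would first normalize the VM's raw trace into whatever common form the executable model produces.

```python
def compare_traces(vm_trace, model_trace):
    """Test oracle core: compare the (adapter-normalized) trace collected
    on the VM against the reference trace from the executable model.

    Returns None if the traces agree, or a diagnostic string describing
    the first divergence. Each record is assumed to be a (pc, state) pair.
    """
    if len(vm_trace) != len(model_trace):
        return (f"length mismatch: VM executed {len(vm_trace)} steps, "
                f"model executed {len(model_trace)}")
    for step, (vm_rec, model_rec) in enumerate(zip(vm_trace, model_trace)):
        if vm_rec != model_rec:
            return f"mismatch at step {step}: VM {vm_rec} vs model {model_rec}"
    return None  # traces agree: the test passes

# Example: identical traces pass; a diverging register value is reported.
ok = compare_traces([(0, {"r0": 1})], [(0, {"r0": 1})])
bad = compare_traces([(0, {"r0": 1})], [(0, {"r0": 2})])
```

Keeping the oracle independent of the VM's concrete trace format (via the adapter) is what lets the same executable model check implementations that log their execution differently.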
About the Author
Alexander Sergeevich PROTSENKO (Russian Federation)
A researcher at the Software Engineering Department of ISP RAS. His research interests include language virtual machines, microprocessors, instruction set architecture, verification and testing.
References
1. Smith J., Nair R. Virtual machines: versatile platforms for systems and processes. Elsevier, 2005.
2. Li X. F. Advanced design and implementation of virtual machines. CRC Press, 2016.
3. Amdahl G. M., Blaauw G. A., Brooks F. P. Architecture of the IBM System/360 // IBM Journal of Research and Development. 1964. Vol. 8, No. 2. pp. 87-101.
4. Howden W. E. Theoretical and empirical studies of program testing // IEEE Transactions on Software Engineering. 1978. No. 4. pp. 293-298.
5. Open-source project Ark (Panda) Runtime [Online]. Available at: https://gitee.com/openharmony/arkcompiler_runtime_core/tree/master/static_core/
6. Polito G., Ducasse S., Tesone P. Interpreter-guided differential JIT compiler unit testing // Proceedings of the 43rd ACM SIGPLAN International Conference on Programming Language Design and Implementation. 2022. pp. 981-992.
7. Sirer E. G., Bershad B. N. Using production grammars in software testing // ACM SIGPLAN Notices. 1999. Vol. 35, No. 1. pp. 1-13.
8. Java Language and Virtual Machine Specifications [Online]. Available at: https://docs.oracle.com/javase/specs/
9. Sirer E. G., Bershad B. N. Testing Java virtual machines // Proc. Int. Conf. on Software Testing And Review. 1999.
10. Chen Y. et al. Coverage-directed differential testing of JVM implementations // Proceedings of the 37th ACM SIGPLAN Conference on Programming Language Design and Implementation. 2016. pp. 85-99.
11. Chen Y., Su T., Su Z. Deep differential testing of JVM implementations // 2019 IEEE/ACM 41st International Conference on Software Engineering (ICSE). IEEE, 2019. pp. 1257-1268.
12. Zhao Y. et al. History-driven test program synthesis for JVM testing // 2022 IEEE/ACM 44th International Conference on Software Engineering (ICSE). 2022. pp. 1133-1144. https://doi.org/10.1145/3510003.3510059
13. Wu M. et al. JITfuzz: Coverage-guided fuzzing for JVM just-in-time compilers // 2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE). 2023. pp. 56-68. DOI: 10.1109/ICSE48619.2023.00017
14. Wu M. et al. SJFuzz: Seed and mutator scheduling for JVM fuzzing // Proceedings of the 31st ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering. 2023. pp. 1062-1074.
15. Zang Z. et al. Compiler testing using template Java programs // Proceedings of the 37th IEEE/ACM International Conference on Automated Software Engineering. 2022. pp. 1-13.
16. Calvagna A., Tramontana E. Automated conformance testing of Java virtual machines // 2013 Seventh International Conference on Complex, Intelligent, and Software Intensive Systems. IEEE, 2013. pp. 547-552.
17. Misse-Chanabier P. Testing a virtual machine developed in a simulation-based virtual machine generator. PhD thesis. Université de Lille, 2022.
18. Hrishikesh M. S., Rajagopalan M., Sriram S., Mantri R. System Validation at ARM: Enabling our Partners to Build Better Systems. White Paper. April 2016.
19. Mahapatra R. N., Bhojwani P., Lee J., Kim Y. Microprocessor Evaluations for Safety-Critical, Real-Time Applications: Authority for Expenditure No. 43 Phase 3 Report. 2009. 43 p.
20. Adir A. et al. Genesys-Pro: Innovations in test program generation for functional processor verification // IEEE Design & Test of Computers. 2004. Vol. 21, No. 2. pp. 84-93.
21. Li T. et al. MA²TG: A functional test program generator for microprocessor verification // 8th Euromicro Conference on Digital System Design (DSD'05). IEEE, 2005. pp. 176-183.
22. Kamkin A. S. Test program generation for microprocessors // Proceedings of the Institute for System Programming of the RAS. 2008. Vol. 14, No. 2. pp. 23-63. (In Russ.)
23. Kamkin A. Combinatorial model-based test program generation for microprocessors // Preprint of ISP RAS. 2009.
24. Chupilko M., Kamkin A., Protsenko A., Smolov S., Tatarnikov A. MicroTESK: Automated architecture validation suite generator for microprocessors // DVCon Europe 2018.
25. Kamkin A., Tatarnikov A. MicroTESK: A tool for constrained random test program generation for microprocessors // Perspectives of System Informatics: 11th International Andrei P. Ershov Informatics Conference, PSI 2017, Moscow, Russia, June 27-29, 2017, Revised Selected Papers. Springer International Publishing, 2018. pp. 387-393.
26. Peña R. et al. SMT-based test-case generation and validation for programs with complex specifications // Analysis, Verification and Transformation for Declarative Programming and Intelligent Systems: Essays Dedicated to Manuel Hermenegildo on the Occasion of His 60th Birthday. Cham: Springer Nature Switzerland, 2023. pp. 188-205.
27. Khurshid S., Marinov D. TestEra: Specification-based testing of Java programs using SAT // Automated Software Engineering. 2004. Vol. 11. pp. 403-434.
28. Open-source project MicroTESK [Online]. Available at: https://forge.ispras.ru/projects/microtesk
29. Freericks M. The nML machine description formalism. Leiter der Fachbibliothek Informatik, Sekretariat FR 5-4, 1991.
30. Vishnoi S. K. Functional simulation using sim-nML. Master's thesis. Department of Computer Science and Engineering, IIT Kanpur, 2006.
31. Tatarnikov A. D. Automated construction of test program generators for microprocessors based on formal specifications. PhD thesis. Institute for System Programming of the RAS, Moscow, 2017. 162 p. (In Russ.)
32. Protsenko A. S., Tatarnikov A. D. Automatic generation of test templates based on instruction set specifications // New Information Technologies in the Research of Complex Structures. 2018. pp. 82-83. (In Russ.)
33. Kamkin A., Khoroshilov A., Kotsynyak A., Putro P. Deductive binary code verification against source-code-level specifications // Tests and Proofs: 14th International Conference, TAP 2020, Held as Part of STAF 2020, Bergen, Norway, June 22-23, 2020, Proceedings. Springer International Publishing, 2020. pp. 43-58.
34. Protsenko A. S. Architecture-independent test templates for virtual machines and microprocessors // Information Technologies. 2024. Vol. 30, No. 11. pp. 579-584. doi:10.17587/it.30.579-584 (In Russ.)
35. Zelenov S., Zelenova S. Model-based testing of optimizing compilers // International Workshop on Formal Approaches to Software Testing. Berlin, Heidelberg: Springer, 2007. pp. 365-377.
36. gcov – a Test Coverage Program [Online]. Available at: https://gcc.gnu.org/onlinedocs/gcc/Gcov.html
37. TMS570LS31x/21x Series Microcontroller, Silicon Errata. Texas Instruments, 2013. https://www.ti.com/lit/er/spnz195g/spnz195g.pdf
38. dsPIC30F5011/5013 Rev A1/A2 Silicon Errata. Microchip, 2005. https://ww1.microchip.com/downloads/en/DeviceDoc/80210e.pdf
For citations:
PROTSENKO A.S. Functional Testing of Language Virtual Machines Based on Formal ISA Specifications. Proceedings of the Institute for System Programming of the RAS (Proceedings of ISP RAS). 2025;37(1):65-86. (In Russ.) https://doi.org/10.15514/ISPRAS-2025-37(1)-4