Designing classes’ interfaces for neural network graph model
https://doi.org/10.15514/ISPRAS-2019-31(4)-6
Abstract
An approach to testing artificial neural networks is described. The neural network model is based on graph theory and on the operations used in theoretical work on graphs, trees, paths, cycles, and circuits. The methods are implemented in a C++ program as a set of data structures and algorithms for processing them. C++ classes serve as the data structures for such objects as graph vertices, edges, directed and undirected graphs, spanning trees, and circuits. Lists of standard methods (constructors, destructors, and various assignment operations) are given for all classes. Additional operations are described in detail, among them adding one graph to another, adding an edge to a graph, removing edges and vertices from a graph, and normalizing a graph. Many search operations are offered. Graph sorting operations are also included in the graph model; some of them resemble array sorting algorithms, while others are more specific. On top of these low-level operations, several more complex operations are treated as components of the graph model: building a spanning tree for an arbitrary graph, building a cotree for a spanning tree of a graph, discovering circuits in a graph, evaluating the sign of a circuit, and so on. Examples of the interfaces of the most important overloaded operations on these objects are given, along with an example implementation of a testing procedure that uses the overloaded operations of the graph model objects.
About the Authors
Yuri Leonidovich Karpov
Russian Federation
Candidate of Technical Sciences, Head of Department
Irina Anatolievna Volkova
Russian Federation
Candidate of Physics and Mathematics, Associate Professor in the Department of Algorithmic Languages, Faculty of CMC
Alexey Alexandrovich Vylitok
Russian Federation
Candidate of Physics and Mathematics, Associate Professor in the Department of Algorithmic Languages, Faculty of CMC
Leonid Evgenievich Karpov
Russian Federation
Doctor of Technical Sciences, Leading Researcher at ISP RAS, Associate Professor in the System Programming Department, Faculty of CMC
Yuri Gennadievich Smetanin
Russian Federation
Doctor of Physical and Mathematical Sciences, Chief Researcher at the Dorodnicyn Computing Centre of FRC "Informatics and Control" of RAS, Senior Researcher in the Department of Intelligent Systems, Faculty of Physics and Technology, MIPT
For citations:
Karpov Yu., Volkova I.A., Vylitok A.A., Karpov L.E., Smetanin Yu.G. Designing classes’ interfaces for neural network graph model. Proceedings of the Institute for System Programming of the RAS (Proceedings of ISP RAS). 2019;31(4):97-112. (In Russ.) https://doi.org/10.15514/ISPRAS-2019-31(4)-6