ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)


Articles of journal issue № 3, 2012.


21. Playback of heterogeneous videos on 3D object edges in the visualization subsystem of a training simulation system [№ 3, 2012]
Author: Giatsintov A.M.
Visitors: 9192
This article describes a new method of playing heterogeneous videos on the edges of objects in a virtual 3D scene. The architecture of the video file decoder is presented, algorithms for synchronizing audio and video data are described, and methods of uploading decoded video frames to video card memory are shown. Limitations and issues of video playback in a virtual scene are analyzed. Training simulation systems (TSS) should conform to the requirements of personnel training methods and usually contain a large number of informational resources. One type of resource used in training is video. The main advantages of using video in a TSS are the ability to visualize different processes and the integration of the instructor's image into the virtual 3D scene. Video playback in a virtual 3D scene is a complex task because there are factors to consider that are not relevant when playing video in a media player. The visualization subsystem must achieve an acceptable frame rate (not less than 25 frames per second) and react to external actions, such as changes of scene parameters and the loading of additional objects. The developed architecture of the video playback subsystem makes it possible to decode and visualize several high-definition videos in a 3D scene. It consists of the following components: a video decoder that decodes video and audio packets; a sound subsystem that plays the decoded sound; a control module used to play a specific video, pause it, set its volume and so on; and an interface that glues the video playback subsystem to the visualization subsystem.
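The abstract names four components and an audio/video synchronization step but does not publish an API. Below is a minimal, hypothetical Python sketch of that component split; all class and method names are assumptions for illustration, and the synchronization scheme (displaying frames whose timestamps do not exceed the audio clock) is one common approach, not necessarily the paper's.

```python
# Minimal sketch of the decoder / sound / control / glue split described above.
# All names are hypothetical; the paper does not publish its interfaces.
import queue

class VideoDecoder:
    """Decodes demultiplexed packets into raw frames and audio samples."""
    def __init__(self):
        self.video_frames = queue.Queue(maxsize=8)    # bounded: backpressure
        self.audio_samples = queue.Queue(maxsize=32)

    def decode_packet(self, packet):
        kind, payload, pts = packet
        if kind == "video":
            self.video_frames.put((pts, payload))
        else:
            self.audio_samples.put((pts, payload))

class SoundSubsystem:
    """Plays the decoded sound."""
    def play(self, pts, samples):
        print(f"audio @ {pts}: {len(samples)} samples")

class PlaybackControl:
    """Play/pause/volume commands exposed to the simulation logic."""
    def __init__(self):
        self.playing, self.volume = False, 1.0
    def play(self):  self.playing = True
    def pause(self): self.playing = False

class VisualizationGlue:
    """Uploads decoded frames to the renderer (video card memory in the paper)."""
    def upload_frame(self, pts, frame):
        print(f"upload frame @ {pts} to texture")

def present(decoder, glue, audio_clock):
    # Audio-driven sync: show every frame whose timestamp is behind the audio clock.
    while not decoder.video_frames.empty():
        pts, frame = decoder.video_frames.queue[0]
        if pts > audio_clock:
            break
        decoder.video_frames.get()
        glue.upload_frame(pts, frame)

decoder, glue = VideoDecoder(), VisualizationGlue()
decoder.decode_packet(("video", b"frame0", 0.00))
decoder.decode_packet(("video", b"frame1", 0.04))
present(decoder, glue, audio_clock=0.02)   # uploads only frame0
```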

22. Improving the efficiency of debugging and testing techniques for multiprocessor systems [№ 3, 2012]
Author: Lavrinov G.A.
Visitors: 11017
A very important element of a multiprocessor computing system is the communication network used by a processor to communicate with other processors or with memory. Alongside VME, PCI Express, HyperTransport and other inter-processor communication buses, the RapidIO interface is also being developed. Microprocessor systems based on RapidIO are rather complex and present a serious problem at the debugging and initial testing stage of trial and development models. Full-scale testing is performed under operating systems (in this case, Linux and the OS RV Baget 2.0 and 3.0); however, a successful start of an operating system assumes that the communication network and processor units are already operating properly. At this stage the test designer has only the hardware and a ROM program that receives control automatically after power-on or after a RESET signal. This article provides two testing and debugging techniques for multiprocessor systems built on the RapidIO interface that can work with a minimal amount of hardware. The article also compares the effectiveness of these techniques in use and at the testing stage carried out on their basis. Using a UML sequence chart, the article presents a protocol that implements an integrated RapidIO console and a RapidIO I/O communication protocol for gathering information about testing results. It is shown how specific RapidIO packets are used. These testing techniques for multiprocessor systems became the basis for a system test of systems built on 1890BM6YA chips.
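To make the "console over the interconnect" idea concrete, here is a toy Python sketch of a request/response exchange for pulling test results out of a target that runs only ROM code. The packet layout, opcodes, IDs and addresses are invented for illustration; real RapidIO packet formats and the paper's protocol differ.

```python
# Hypothetical illustration: the host reads a result buffer on the unit under
# test through simple read-request packets, so results can be gathered with
# nothing but a ROM program on the target.
import struct

READ_REQ, READ_RESP = 0x01, 0x02

def make_read_request(dest_id, src_id, address, length):
    # opcode, destination ID, source ID, address, payload length
    return struct.pack(">BHHIB", READ_REQ, dest_id, src_id, address, length)

def handle_request(packet, memory):
    # Target side: answer a read request with the requested bytes.
    op, dest, src, addr, length = struct.unpack(">BHHIB", packet)
    payload = bytes(memory[addr:addr + length])
    return struct.pack(">BHH", READ_RESP, src, dest) + payload

memory = bytearray(256)
memory[0x10:0x14] = b"PASS"                       # target's test verdict
req = make_read_request(dest_id=1, src_id=0, address=0x10, length=4)
resp = handle_request(req, memory)
print(resp[5:])                                   # b'PASS'
```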

23. Techniques for reducing vulnerabilities in special real-time software [№ 3, 2012]
Author: Narkhov K.G.
Visitors: 12801
This article is dedicated to automating a programmer's work, particularly in the area of reducing vulnerabilities and bugs in program code. The article reviews the details of program design in the technical facilities for automated code generation for special software (TFACG SS) and the use of a library included in TFACG SS that reduces the potential vulnerabilities appearing in new programs. The author provides a taxonomy of typical vulnerabilities in real-time programs, reviews each class of vulnerability, how often and why it occurs, and how it is prevented using TFACG SS facilities. Some potential vulnerabilities take into account the configuration of the real-time operating system. The taxonomy of vulnerabilities was built with a static analysis tool and a set of real-time source code developed at the Institute for Scientific Research of the Russian Academy of Sciences. The set of source code includes 204 program modules (more than 111,700 lines). The article finishes with an example of reducing potential vulnerabilities in the real-time source code generation program (PVSC RT), which is a part of TFACG SS. The article shows a method of reducing vulnerabilities using the standard program patterns provided by TFACG SS; this method repaired all vulnerabilities found by the static analysis tool in PVSC RT. The main prospects for the solutions presented in the article are the expansion of the library of standard TFACG SS patterns and the extension of the static analyzer rules to include tests and conditions specific to the real-time operating system.
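The abstract does not list the TFACG SS patterns or analyzer rules, so the following is only a generic illustration of the kind of check discussed: a toy static rule, written in Python, that flags calls whose return status is ignored in real-time code. The function names and the rule itself are assumptions, not the article's.

```python
# Toy static check: flag calls whose result is discarded (a common class of
# defect in real-time code). CHECKED_CALLS is a hypothetical API list.
import re

CHECKED_CALLS = {"msgQReceive", "semTake", "malloc"}

def find_unchecked_calls(source: str):
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for name in CHECKED_CALLS:
            # the call starts the statement, so its return value is ignored
            if re.match(rf"\s*{name}\s*\(", line):
                findings.append((lineno, name))
    return findings

code = """
    semTake(lock, WAIT_FOREVER);
    status = msgQReceive(q, buf, len, TIMEOUT);
"""
print(find_unchecked_calls(code))   # [(2, 'semTake')]
```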

24. Algorithm of X-graph growth and principles of physics [№ 3, 2012]
Authors: Koganov A.V., Krugly A.L.
Visitors: 8128
The work belongs to a current trend at the intersection of the theory of automata and algorithms, graph theory, and mathematical physics. In recent years a theory of growing X-graphs has been developed, in which each vertex of an X-graph (an X-element) models an elementary interaction of two initial particles with the generation of two resulting particles. The growth of such a graph models the observer obtaining information about physical processes taking place in its space-time neighborhood. The paper studies an algorithm for the incremental formation of an X-graph that meets a set of requirements necessary for a discrete space-time model in quantum physics. Special attention is given to the implementation of the causality principle in the algorithm, which makes its interpretation as a model of a physical process observer correct. The new algorithm possesses useful properties that were not present in previously proposed analogous algorithms. The main one is that the probability of completing a set of pairwise causally unrelated vertices does not depend on the order in which those vertices are introduced. The basis of the algorithm is a new way of selecting edges for the addition of a new X-element: a random path to the boundary is taken from a randomly chosen vertex among those already present in the graph. The algorithm is also interesting from the point of view of the theory of self-organization of complex growing systems. Its modifications and variations of initial states allow models of various systems of pairwise interactions to be built.
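As a rough Python sketch of the growth step just described: pick a random existing vertex, follow a random directed path until a vertex with an unattached outgoing edge (the boundary), and attach the new X-element, which itself emits two outgoing edges. The bookkeeping below is invented for illustration and does not reproduce the exact rules or probabilities of the paper's algorithm.

```python
# Each X-element has two incoming and two outgoing edges; growth attaches a
# new element to two free outgoing edges found by random walks to the boundary.
import random

class XGraph:
    def __init__(self):
        self.children = {0: []}        # vertex -> vertices it points to
        self.free_out = {0: 2}         # unattached outgoing edges per vertex

    def add_x_element(self):
        new = len(self.children)
        parents = []                   # free outgoing edges chosen for the new element
        budget = dict(self.free_out)   # reserve edges on a copy while choosing
        while len(parents) < 2:
            v = random.choice(list(self.children))
            # random directed path until a vertex with a free outgoing edge
            while budget.get(v, 0) == 0:
                v = random.choice(self.children[v])
            budget[v] -= 1
            parents.append(v)
        for p in parents:
            self.free_out[p] -= 1
            self.children[p].append(new)
        self.children[new] = []
        self.free_out[new] = 2         # the new X-element emits two particles
        return new

g = XGraph()
for _ in range(6):
    g.add_x_element()
print(g.children)
```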

25. Microbenchmarks for microprocessor RTL-model performance assessment [№ 3, 2012]
Author: Nikolina N.V.
Visitors: 6071
An approach to evaluating and monitoring microprocessor performance at the design stage is considered. A technique is offered for estimating the performance of separate blocks while ignoring any potential influence of the others. We propose a test suite for evaluating the performance of microprocessor RTL (register transfer level) models. The test set consists of short programs (microbenchmarks) aimed at evaluating the performance of separate blocks; the choice of tests for different modules takes into account the features of their operation. The article presents a number of test situations for evaluating such modules as the instruction fetch and dispatch buffer (IFDB), the floating-point unit (FPU) and the memory management unit (MMU) for a MIPS-like architecture. Run time is analyzed using performance counters, which are part of the control coprocessor registers of the microprocessor. Automation of test case creation, regression performance measurement and visualization of performance evaluation results is proposed. Within reasonable time the test system allows performance evaluation results to be obtained and compared with the results of previous versions of the RTL model or with reference values. The impact of performance measurements on the architecture of the future chip is also considered. The possibility of investigating the influence of factors such as changing the memory frequency on microprocessor performance is shown. The measurement results are illustrated by the performance evaluation of a superscalar microprocessor developed at SRISA RAS. The results were confirmed on the final chip.
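The regression-measurement step lends itself to a short sketch: compare the cycle counts reported by the performance counters against reference values (or a previous RTL version) and flag slowdowns. The test names, numbers and tolerance below are invented; they only illustrate the comparison, not the article's actual test suite.

```python
# Compare measured cycle counts against reference values and flag regressions.
REFERENCE = {"ifdb_fetch_loop": 1200, "fpu_mul_chain": 5400, "mmu_tlb_miss": 900}
TOLERANCE = 0.02   # 2 % slack for run-to-run noise

def check_regressions(measured: dict, reference: dict = REFERENCE):
    report = []
    for name, ref_cycles in reference.items():
        got = measured.get(name)
        if got is None:
            report.append((name, "MISSING", ref_cycles, None))
        elif got > ref_cycles * (1 + TOLERANCE):
            report.append((name, "SLOWER", ref_cycles, got))
        else:
            report.append((name, "OK", ref_cycles, got))
    return report

measured = {"ifdb_fetch_loop": 1190, "fpu_mul_chain": 5650, "mmu_tlb_miss": 905}
for row in check_regressions(measured):
    print(row)
```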

26. The role of stochastic testing in the functional verification of microprocessors [№ 3, 2012]
Author: Khisambeev I.Sh.
Visitors: 10148
With the growth of the performance requirements for modern ICs, including microprocessors and systems-on-chip, their development becomes considerably more complicated. It has become a multistage process, and there are many sophisticated tasks to be solved at each stage. One of the most labor-consuming tasks is design functional verification. Its goal is to confirm the conformity between the implementation of a design and the functional requirements of its specification. Considering the complexity of modern IC designs, the problem does not yet have a general solution, and several complementary approaches have been developed to address it. One of them is stochastic testing. It was applied to the verification of MIPS64 architecture microprocessors at the Scientific Research Institute for System Analysis of RAS. The method is based on simulating the execution of test programs. Test programs are generated automatically from a given template; instructions, arguments and settings for the test are chosen randomly, taking into account given biases and constraints. This paper is a review aiming to specify the role of stochastic testing along with its application scope, advantages and disadvantages. The introduction considers functional verification in general, as a part of the IC design workflow. Then the most well-known verification approaches are reviewed and their underlying ideas are briefly analyzed; in particular, simulation-based methods are considered. Finally, the stochastic testing method is described against this background. Conclusions concerning its advantages and disadvantages are illustrated with some results of its application in SRISA RAS.
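The core mechanism, generating test programs from a template with weighted random choices, can be sketched briefly. The instruction subset, weights and template format below are assumptions made for illustration; real generators of this kind work with the full ISA and far richer constraints.

```python
# Template-driven random test generation: instructions and operands are drawn
# at random under given weights (biases). Everything here is illustrative.
import random

TEMPLATE = {
    "length": 12,
    "weights": {"add": 4, "mul": 2, "lw": 3, "sw": 3, "beq": 1},  # biases
    "registers": [f"$r{i}" for i in range(1, 8)],
}

def generate_test(template, seed=None):
    rng = random.Random(seed)
    ops, weights = zip(*template["weights"].items())
    program = []
    for _ in range(template["length"]):
        op = rng.choices(ops, weights)[0]
        rd, rs, rt = rng.sample(template["registers"], 3)
        if op in ("lw", "sw"):
            program.append(f"{op} {rd}, {rng.randrange(0, 256, 4)}({rs})")
        elif op == "beq":
            program.append(f"{op} {rs}, {rt}, next{len(program)}")
        else:
            program.append(f"{op} {rd}, {rs}, {rt}")
    return program

for line in generate_test(TEMPLATE, seed=42):
    print(line)
```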

27. Verification of a microprocessor and its RTL-model by means of Linux OS user applications [№ 3, 2012]
Author: Chibisov P.A.
Visitors: 11121
This article covers methods of verification and testing of modern microprocessors. Special attention is given to a method of testing RTL-models, FPGA prototypes and test chips of microprocessors with real user applications for the Linux operating system. The interrelation of these objects and the degree to which the discussed technique applies to each of them in the context of the general verification plan are also considered. The article lists the merits and drawbacks of the method. Since simulation of programs on the RTL-model of a microprocessor is extremely slow, it is proposed to use a cut-and-restore mechanism for the model state in order to split the whole instruction sequence of an operating system boot into a set of subsequences that are executed in parallel on different computers. The existence of a large number of freely distributed open-source programs with built-in automated self-test mechanisms makes it possible to single out the launching of Linux OS applications as a separate approach to testing general-purpose microprocessors. The described method does not exclude, but rather supplements, the modern set of methods and means of testing and verification of microprocessors and their models. Many authoritative developers and manufacturers of microprocessors recognize the usefulness of booting an operating system as early as possible on the RTL-model under development. Success in this operation often gives developers more confidence that their work is done correctly than tens of thousands of executed tests. The article provides an example of a representative test set that makes it possible to use ready-made user software packages, as well as examples of the test program sources. It also considers the general algorithm of actions for finding a bug in the microprocessor and gives examples of bugs revealed in a microprocessor with the MIPS64 architecture.
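The cut-and-restore idea can be illustrated with a small Python sketch: the long boot sequence is partitioned at checkpoints, and each window is simulated independently from its saved state, so the windows can run in parallel on different machines. The window size, the placeholder simulation function and the result format are assumptions; the state capture and restore machinery of a real RTL simulator is not shown.

```python
# Split a long instruction sequence into checkpointed windows and simulate
# the windows in parallel processes (stand-in for separate machines).
from concurrent.futures import ProcessPoolExecutor

def split_into_windows(total_instructions: int, window: int):
    """Return (start, end) instruction ranges, one per checkpoint."""
    return [(s, min(s + window, total_instructions))
            for s in range(0, total_instructions, window)]

def simulate_window(bounds):
    start, end = bounds
    # placeholder for: restore checkpoint `start`, simulate on the RTL-model
    # until `end`, then compare against a reference instruction-level simulator
    return (start, end, "OK")

if __name__ == "__main__":
    windows = split_into_windows(total_instructions=1_000_000, window=250_000)
    with ProcessPoolExecutor() as pool:
        for result in pool.map(simulate_window, windows):
            print(result)
```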

28. Metadescriptions and cataloguing of scientific information resources of the RAS [№ 3, 2012]
Authors: Erkimbaev A.O., Zhizhchenko A.B., Zitserman V.Yu., Kobzev G.A., Serebryakov V.A., Sotnikov A.N., Shiolashvili L.N.
Visitors: 10385
A large part of scientific knowledge is formalized in the form of electronic resources: data and knowledge bases, electronic reference books, etc. Work with electronic resources, including their adaptation to the subject area, systematization and accumulation of data, has achieved a status equal to theory and experiment. Such disciplines as bio- and geoinformatics have appeared, whose subject of study is the representation of complex data. However, with the spread of databases and similar tools, deep problems arose caused by a lack of interoperability. The autonomy of resource operation, the diversity of data formats and structures, and the lack of data presentation standards are by no means all of the reasons complicating data exchange. In global and domestic practice in recent years, approaches to resolving these problems have emerged that use versions of the XML language to standardize metadata systems and term dictionaries within a certain area of expertise, such as CML for representing chemical data, MatML for materials science, and ThermoML for thermodynamics. The pressing need to elaborate principles and technologies for integrating many RAS resources led to the formation of an extensive program for the creation of a so-called Data Centre. It is expected that this project will help to overcome the fragmentation and limited availability of digital resources in the form of databases, electronic publications and data-processing tools supported by various institutes of the Russian Academy of Sciences. In this work, a system of resource certification, adequately reflecting the subject area, resource types, access conditions, etc., is offered as the first phase of the integration. A portal has been developed that holds an extensive set of metadata for each registered resource.
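To make the idea of a resource "certificate" tangible, here is a small Python sketch that builds an XML metadata record of the kind such a catalogue might store. The element names, field values and URL are invented; the abstract does not reproduce the portal's actual metadata schema.

```python
# Hedged illustration of an XML metadata record for a registered resource.
import xml.etree.ElementTree as ET

def make_resource_record(title, subject_area, resource_type, access, url):
    record = ET.Element("resource")
    for tag, value in [("title", title), ("subjectArea", subject_area),
                       ("resourceType", resource_type), ("access", access),
                       ("url", url)]:
        ET.SubElement(record, tag).text = value
    return record

rec = make_resource_record(
    title="Thermophysical properties database",
    subject_area="thermodynamics",
    resource_type="database",
    access="open",
    url="http://example.org/resource",     # placeholder URL
)
print(ET.tostring(rec, encoding="unicode"))
```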

29. Data integration and a query language in large information infrastructures [№ 3, 2012]
Authors: Kovalenko V.N., Kulikov A.Yu.
Visitors: 6714
The automation of various forms of professional activity by means of computer technologies generates arrays of information stored in databases. This information is used first of all inside organizations, but it can also be helpful for solving important tasks outside their borders. The development of such applications is considerably complicated by the absence of specialized system tools supporting access to data from multiple databases. In this direction, called data integration, general application-independent methods allowing the consolidation of heterogeneous databases are being developed. Tools created on this basis are used in practice; however, the problem of their scalability with respect to the number of integrated databases remains open. The paper describes an approach to the problem of mass integration (tens and hundreds of databases). Two questions that seem the most essential under these conditions are considered: the method of data integration and the type of informational queries. The integration method allows a representation (the global scheme) to be defined in which the data of the integrated databases form a common unified space. The method is aimed at the creation of information infrastructures with a dynamically changing set of databases: changing the set does not require modification of the global scheme or of existing applications. The query language is an extension of SQL-92, with the difference that operations are executed on subsets of databases. In addition, databases are not addressed explicitly: descriptive information, meta-attributes, is used for their selection. This type of query allows applications to be created that are capable of processing data from varied sets of sources.
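The key point, addressing databases by meta-attributes rather than by name, can be sketched in a few lines of Python. The registry, attribute names and demo query below are invented for illustration and do not reproduce the paper's SQL-92 extension; the sketch only shows one query being dispatched to every database whose meta-attributes match.

```python
# Select target databases by descriptive meta-attributes, then run one query
# against each of them. SQLite in-memory databases stand in for real sources.
import sqlite3

REGISTRY = [
    {"name": "hr_north",  "region": "north", "domain": "hr",      "dsn": ":memory:"},
    {"name": "hr_south",  "region": "south", "domain": "hr",      "dsn": ":memory:"},
    {"name": "fin_north", "region": "north", "domain": "finance", "dsn": ":memory:"},
]

def select_databases(**meta):
    """Pick databases whose meta-attributes match the given values."""
    return [d for d in REGISTRY
            if all(d.get(k) == v for k, v in meta.items())]

def query_all(sql, **meta):
    rows = []
    for db in select_databases(**meta):
        conn = sqlite3.connect(db["dsn"])
        conn.execute("CREATE TABLE staff(name TEXT)")          # demo data
        conn.execute("INSERT INTO staff VALUES (?)", (db["name"],))
        rows += [(db["name"], *r) for r in conn.execute(sql)]
        conn.close()
    return rows

# run one query against every database whose 'domain' meta-attribute is 'hr'
print(query_all("SELECT name FROM staff", domain="hr"))
```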

30. Software reliability analysis based on the inhomogeneous Poisson process model and bootstrap methods [№ 3, 2012]
Authors: Guda A.N., Chubeyko S.V.
Visitors: 7142
A new mathematical model of software reliability is described, based on the model of an inhomogeneous Poisson process. The basic idea of the forecasting method proposed in the article is the reproduction of data samples containing two original sets: the cumulative program execution time and the number of errors committed during this time. Randomized sample reproduction uses a bootstrap technique based on random quantities having a Poisson distribution. Algorithms for parameter estimation and for forecasting software reliability indicators are suggested. The first algorithm is used to assess the intensity of errors expected in subsequent versions of the software; it uses a random number generator on whose basis randomized samples and random arrays are arranged according to the Poisson law. The second algorithm makes it possible to evaluate the intensity of error detection; it uses the data samples from the first algorithm and operates according to the maximum likelihood method. The article describes the general procedure for forecasting the expected number of errors that can occur during a subsequent program run over a certain time interval following the cumulative period of observation. The proposed forecasting method was implemented as a program written in the Pascal programming language in the free programming environment PascalABC.NET. Examples of using the forecasting software with some test data are also described.
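The article's implementation is in Pascal and its exact model equations are not given in the abstract; the following Python sketch only illustrates the bootstrap idea: given cumulative execution times and error counts, resample the per-interval error increments with Poisson draws and estimate an error intensity. The data values and the intensity estimate are invented for the example.

```python
# Bootstrap resampling of error counts with Poisson random variates (sketch).
import math, random, statistics

exec_time   = [10, 25, 45, 70, 100]     # cumulative execution time (made-up data)
error_count = [ 4,  7,  9, 10, 11]      # cumulative errors observed (made-up data)

def poisson(lam, rng):
    """Knuth's algorithm for drawing a Poisson-distributed random variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def bootstrap_intensity(times, counts, n_boot=1000, seed=1):
    rng = random.Random(seed)
    intensities = []
    for _ in range(n_boot):
        total = 0
        for i in range(1, len(times)):
            increment = counts[i] - counts[i - 1]     # errors in this interval
            total += poisson(increment, rng)          # resampled under Poisson law
        intensities.append(total / (times[-1] - times[0]))
    return statistics.mean(intensities), statistics.stdev(intensities)

mean_rate, spread = bootstrap_intensity(exec_time, error_count)
print(f"estimated error intensity: {mean_rate:.3f} +/- {spread:.3f} errors per time unit")
```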
