ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)


Articles of journal issue No. 4, 2019.


1. Fuzzy expert evaluations of adverse external impacts on the marine search and rescue operations effectiveness [No. 4, 2019]
Authors: V.V. Kochnev, A.S. Rekhov, V.E. Sorokin, E.V. Taranukha
Visitors: 3587
The paper considers using fuzzy expert evaluations of adverse external impacts on the effectiveness of various marine search and rescue operations. Under the existing methodology, the effectiveness of a search and rescue operation is calculated with a mathematical model under ideal external conditions. It is then adjusted by the probabilities of pairwise independent and joint adverse external events for the various activities of the forces and means involved in the operation; these events are characterized by certain unfavorable factors with empirically determined weights (significances) that reduce efficiency. The absence of gradations for most adverse factors makes the evaluation significantly rough. To overcome this, it is proposed to abandon weights and assess the level of exposure to adverse factors using fuzzy multi-valued verbal expert evaluations. To implement such a transition, linguistic variables of adverse factor impacts are introduced. Their term sets are determined on the basis of a person's known ability to distinguish gradations in verbal evaluations. The values of the term sets (the points on the carrier's base scale at which the membership functions of the fuzzy sets reach their maxima) are correlated with the available weights (significances) of adverse factors. The paper provides a practical example of fuzzy expert evaluation of the influence of adverse factors on search and rescue operation effectiveness, under the assumptions that adjacent gradations of a person's verbal evaluations are equidistant and that the value on the carrier's base scale is zero in the absence of adverse factors. The use of widespread normalized triangular membership functions allows effectively evaluating and adequately adjusting the values of search and rescue operation effectiveness under the influence of adverse factors, even with a small amount of initial data in the form of weights (significances) of adverse factors. Defuzzification of the expert's choice is significantly simplified provided that a value on the carrier's base scale can refer to no more than two adjacent terms of the linguistic variable. On the other hand, the proposed approach to using fuzzy expert evaluations can easily be adapted to detail or expand the initial data in the form of weights of adverse factors, and can also serve as a basis for further development of methods for assessing search and rescue operation effectiveness.
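As a rough, purely illustrative sketch of the kind of computation described above (not the authors' model), the following Python fragment evaluates normalized triangular membership functions over a base scale with equidistant term peaks and adjusts an ideal-conditions effectiveness value by the defuzzified impact level; the term set, peak positions, and adjustment rule are all assumptions.

```python
def triangular(x, a, b, c):
    """Normalized triangular membership function with peak at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical term set of the linguistic variable "adverse impact":
# peaks are equidistant on the base scale [0, 1], zero impact maps to 0.
TERMS = {
    "none":     (-0.25, 0.00, 0.25),
    "weak":     ( 0.00, 0.25, 0.50),
    "moderate": ( 0.25, 0.50, 0.75),
    "strong":   ( 0.50, 0.75, 1.00),
    "critical": ( 0.75, 1.00, 1.25),
}

def impact_level(x):
    """Defuzzify a point on the base scale; at most two adjacent terms are active."""
    active = {t: mu for t, p in TERMS.items() if (mu := triangular(x, *p)) > 0}
    # Membership-weighted centre of the active term peaks (an assumed rule).
    return sum(TERMS[t][1] * mu for t, mu in active.items()) / sum(active.values())

ideal_effectiveness = 0.9   # effectiveness computed under ideal external conditions
expert_point = 0.6          # expert's verbal grade mapped onto the base scale
adjusted = ideal_effectiveness * (1.0 - impact_level(expert_point))
print(f"adjusted effectiveness: {adjusted:.3f}")   # 0.9 * (1 - 0.6) = 0.360
```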

2. The development of fast software implementation of specialized neural network architecture with sparse connections [No. 4, 2019]
Author: Yu.S. Fedorenko
Visitors: 4112
The paper is devoted to the development of a fast software implementation of a specialized neural network architecture. Feature engineering is one of the most important stages in solving machine learning tasks. Nowadays, handcrafted feature selection algorithms are losing popularity, giving way to deep neural networks. However, the application of deep models is limited in online learning tasks, as they are not able to learn in real time. Besides, their use in high-load systems is difficult due to significant computational complexity. In one of the previous articles, the author proposed a neural network architecture with automatic feature selection and the ability to train in real time. However, the specific sparsity of connections in this architecture complicates its implementation on top of classic deep learning frameworks, so the author decided to write a dedicated implementation of the proposed architecture. This paper considers the data structures and algorithms developed for this software implementation. It describes sample processing in detail, from the point of view of the program system, during model prediction and training. For a more complete description of implementation details, UML class, sequence, and activity diagrams are provided. The performance of the developed implementation is compared with implementations of the same architecture based on deep learning frameworks. The analysis has shown that the developed software works an order of magnitude faster than the library-based implementations. Such acceleration is due to the fact that the developed implementation is optimized for a specific architecture, while the frameworks are designed to work with a wide class of neural networks. In addition, benchmarks have shown that the developed implementation of the proposed neural network works only 20-30 percent slower than a simple logistic regression model with handcrafted features. Thus, it can be used in high-load systems.
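The abstract does not detail the data structures, but a typical way to exploit sparse connections is to store, per output neuron, only the indices and weights of its incoming connections. The NumPy sketch below is an assumed illustration of that idea, not the author's implementation:

```python
import numpy as np

class SparseLayer:
    """A layer whose output neurons are each connected to only a few inputs.

    Instead of a dense weight matrix, it stores per-neuron connection indices
    and weights (a common sparse layout; the paper's structures may differ).
    """
    def __init__(self, conn_indices, conn_weights, biases):
        self.conn_indices = conn_indices   # list of int arrays, one per neuron
        self.conn_weights = conn_weights   # list of float arrays, same shapes
        self.biases = biases               # one bias per output neuron

    def forward(self, x):
        out = np.empty(len(self.biases))
        for j, (idx, w) in enumerate(zip(self.conn_indices, self.conn_weights)):
            out[j] = x[idx] @ w + self.biases[j]   # touch only connected inputs
        return np.maximum(out, 0.0)                # ReLU, an assumed activation

# Usage: two output neurons over a six-dimensional input vector.
layer = SparseLayer(
    conn_indices=[np.array([0, 3]), np.array([1, 4, 5])],
    conn_weights=[np.array([0.5, -0.2]), np.array([0.1, 0.7, 0.3])],
    biases=np.array([0.0, 0.1]),
)
print(layer.forward(np.arange(6.0)))   # -> [0.  4.5]
```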

3. The algorithm for forming many options of a supercomputer structure to solve military applied problems [No. 4, 2019]
Authors: Ya.N. Gusenitsa, D.O. Petrich
Visitors: 3963
The paper justifies the necessity of creating and applying supercomputers for military applied problems. It presents the rationale for developing a qualitatively new scientific and methodological apparatus for substantiating supercomputer architectures based on the principles of unification, complexation, and integration, which provides maximum performance. There is a description of a supercomputer architecture using the model of a computer association, including its structure and algorithm. It is shown that a supercomputer structure should be a software-indivisible hardware resource in the organization of the computing process. Therefore, it is advisable to form a supercomputer structure according to the principle of modular scalability from identical basic modules (computing clusters) that, on the one hand, have structural limitations and, on the other hand, must be formed for specific military applied tasks. Each computing cluster includes a balanced number of elementary processors and memory units, combined using a fully accessible fast channel system and capable of implementing basic sets of various tasks. The presented algorithm, unlike existing ones, is based on using the graph spectrum when generating the set of communication structures between computing clusters, as well as between elementary processors within computing clusters. This significantly reduces the computational complexity of forming the set of variants of a supercomputer structure. As a result, the proposed algorithm can be used to justify architectural decisions when creating new supercomputers, as well as to justify the allocation of computing resources for solving military applied problems on existing supercomputers.
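The abstract only states that the graph spectrum is used when generating the set of communication structures. One plausible, purely illustrative reading is that adjacency spectra serve as cheap signatures for discarding coincident (cospectral) candidate topologies, as sketched below; this is an assumption, not the authors' algorithm.

```python
import numpy as np

def spectrum(adjacency):
    """Sorted eigenvalues of a symmetric adjacency matrix, rounded for comparison."""
    return tuple(np.round(np.sort(np.linalg.eigvalsh(adjacency)), 6))

def generate_ring(n):
    """Ring interconnect between n computing clusters."""
    a = np.zeros((n, n))
    for i in range(n):
        a[i, (i + 1) % n] = a[(i + 1) % n, i] = 1.0
    return a

def generate_full(n):
    """Fully connected interconnect between n computing clusters."""
    return np.ones((n, n)) - np.eye(n)

# Keep only spectrally distinct structure variants (an assumed filtering step).
candidates = [generate_ring(6), generate_full(6), generate_ring(6)]
unique = {}
for adj in candidates:
    unique.setdefault(spectrum(adj), adj)
print(f"{len(candidates)} candidates -> {len(unique)} spectrally distinct variants")
```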

4. Modeling and analysis of programs for multidimensional interval-logic controllers [No. 4, 2019]
Authors: A.F. Antipin, E.V. Antipina
Visitors: 4958
The paper examines special software designed for modeling the operation of multidimensional fuzzy interval-logic controllers and for analyzing their programs for programmable logic controllers. Such controllers can be used at enterprises of the chemical, oil, and oil refining industries in the development of automatic control systems for technological processes and objects that do not have adequate mathematical models. The relevance of the software arises from the lack of application programs designed to simulate the operation of fuzzy controllers from available experimental data. The software described in the paper allows calculating the necessary and sufficient number of production rules and the number of critically important rules that make up a production system. In addition, from available initial data obtained experimentally, it makes it possible to create fuzzy models of multidimensional fuzzy interval-logic controllers. The paper presents the results of a computational experiment on creating a fuzzy model of multidimensional fuzzy interval-logic controllers and analyzing their programs for programmable logic controllers, during which the main parameters of this type of controller were calculated: the maximum number of production rules that make up the production system; the total number of terms and the number of critically important terms for each variable; the total number of variable groups and the number of critically important variable groups; the maximum number of production rules and the number of critical production rules for each variable group; and the actual number of critical rules that make up the production system. Based on the calculation results, conclusions were drawn regarding the complexity of a production system for multidimensional fuzzy interval-logic controllers and about how to achieve the required calculation accuracy.
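For the first of the listed parameters, the maximum number of production rules, a common estimate is the product of the term counts of the input variables. The short calculation below illustrates this with invented variables and term counts; the paper's own counting procedure may differ.

```python
from math import prod

# Hypothetical interval-logic controller inputs and their numbers of terms.
term_counts = {
    "temperature": 5,
    "pressure":    4,
    "flow_rate":   3,
}

# Every combination of input terms yields one potential production rule.
max_rules = prod(term_counts.values())
print(f"maximum number of production rules: {max_rules}")   # 5 * 4 * 3 = 60
```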

5. Industrial technologies states management based on risk criterion [No. 4, 2019]
Authors: S.R. Bakasov, G.N. Sanaeva, V.N. Bogatikov
Visitors: 3370
The paper is devoted to the conceptual formulation of the problem of managing the states of industrial technologies. As a practical application of the state management idea, it considers managing a potentially hazardous technology for the selective treatment of tail gases from the production of non-concentrated nitric acid; in this application, state management is closely connected with industrial safety management. One of the problems in the synthesis of industrial technology safety management systems is the presence of uncertainty both in the knowledge of the physicochemical processes and in the influence of random disturbances. This motivates the development of new methods for the synthesis of technological safety management systems, as well as the improvement of existing ones. As a consequence, methods for implementing goal-setting mechanisms and revising control quality criteria become a promising approach for such dynamic processes occurring in poorly structured and poorly formalized environments. These methods rest upon fundamental background knowledge. Various types of defects are reflected in the state variables of technological processes. Violations can be caused by defects in control systems, in process equipment, and in the technological process itself. All kinds of damage in the technological system (non-compliance of source materials with requirements, non-compliance with regulatory and technical documents, and the human factor) lead to similar results. This indicates both the complexity of the diagnosing procedure and the complexity of forming criteria to assess states. From a management point of view, a technological safety system is currently a multi-level, hierarchically organized technological system. The main goals of such systems are to detect malfunctions in time and to take measures to eliminate their root causes. The paper considers the multi-level organization of the technological safety system. As a practical application of the proposed approach, the paper considers the multi-level organization of the technological safety system for the process of selective treatment of tail gases from the production of weak nitric acid. The authors propose the main technology management criterion (a risk criterion for conducting the technological process) and an impulse model of the criterion. Management is based on predictive control. The developed system has allowed not only increasing economic indicators, but also reducing air pollution.
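The abstract does not define the risk criterion itself. Purely as an illustration of a state-based risk index (not the criterion proposed in the paper), one can take the weighted normalized deviation of the process variables from the centre of the safe operating region:

```python
import numpy as np

def risk_index(state, safe_center, safe_halfwidth, weights):
    """Weighted normalized deviation from the centre of the safe region.

    Roughly 0 at the centre of the safe region, about 1 at its boundary,
    greater than 1 outside. Illustrative only, not the paper's criterion.
    """
    deviation = np.abs(state - safe_center) / safe_halfwidth
    return float(np.average(deviation, weights=weights))

# Hypothetical tail-gas treatment variables: temperature, NOx concentration, flow.
state          = np.array([740.0, 0.008, 95.0])
safe_center    = np.array([730.0, 0.005, 100.0])
safe_halfwidth = np.array([ 30.0, 0.005,  20.0])
weights        = np.array([0.5, 0.3, 0.2])

print(f"risk index: {risk_index(state, safe_center, safe_halfwidth, weights):.2f}")
```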

6. Methods and tools for modeling supercomputer job management system [No. 4, 2019]
Authors: A.V. Baranov, D.S. Lyakhovets
Visitors: 6634
The paper discusses methods and tools for modeling supercomputer job management systems (JMS), such as SLURM, PBS, Moab, and the domestic management system of parallel job passing. The highlighted JMS modeling methods include modeling on a real supercomputer system, JMS modeling on virtual nodes, and simulation modeling. The authors consider methods and tools for constructing a model job stream. The example of the management system of parallel job passing shows the impossibility of accurately reproducing a full-scale experiment on a real supercomputer. The paper investigates the adequacy of JMS models in a broad and in a narrow sense. It is shown that a JMS model that is adequate in the narrow sense ensures compliance only with interval indicators and cannot be used as a forecast model. To determine adequacy in the broad sense, the authors consider a numerical estimate of the proximity of two event streams. The first stream is the stream of real supercomputer events; the second is the stream of events produced by a JMS model. The normalized Euclidean distance between two vectors corresponding to the compared streams is proposed as a measure of their proximity. The dimension of the vectors equals the number of processed jobs, and the vector components are the job residence times in the JMS. The method of determining adequacy is based on comparing the real supercomputer statistics with the results of JMS modeling. The reference value of the adequacy measure is determined as the normalized Euclidean distance between the vectors of job residence times in the real system and in the JMS model.
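The proximity measure described above fits in a few lines of Python; the only assumption in this sketch is the exact normalization (here, by the square root of the number of jobs), which the abstract does not spell out.

```python
import numpy as np

def adequacy_measure(real_times, model_times):
    """Normalized Euclidean distance between vectors of job residence times.

    The components are residence times of the same jobs in the real system
    and in the JMS model; normalizing by sqrt(number of jobs) is an assumption.
    """
    real = np.asarray(real_times, dtype=float)
    model = np.asarray(model_times, dtype=float)
    assert real.shape == model.shape, "both streams must cover the same jobs"
    return float(np.linalg.norm(real - model) / np.sqrt(real.size))

# Residence times (minutes) of five jobs: real supercomputer vs. JMS model.
real_times  = [12.0, 45.0, 7.0, 120.0, 33.0]
model_times = [10.0, 50.0, 9.0, 110.0, 30.0]
print(f"adequacy measure: {adequacy_measure(real_times, model_times):.2f}")
```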

7. Method and software for intellectual support of making logistics decisions [No. 4, 2019]
Authors: V.V. Borisov, A.V. Ryazanov
Visitors: 3125
The paper proposes a method of intellectual support for making logistics decisions that allows solving the following complex of problems: determining the required resources when distributing orders by territorial zones; dividing the logistics service territory into zones based on genetic clustering; distributing orders by zones according to their purpose; and fuzzily assessing and assigning logistics facilities to fulfill orders based on the modified G. Kuhn method. To implement the stages of the method, a set of procedures has been developed, namely: dividing the service territory into zones based on genetic clustering; determining the required number of logistics facilities (resources) when distributing orders by zones based on a moving time window; and distributing orders by zones and assigning logistics facilities to fulfill orders based on the modified G. Kuhn method. The evaluation of order distribution by zones and the assignment of logistics facilities are based on the integral indicator of the degree of compliance between logistics facilities and orders, considered in the paper. To determine this indicator, a cascading fuzzy production model has been developed that allows taking into consideration heterogeneous characteristics of logistics facilities and various strategies for order distribution. Algorithms and software implementing the proposed method of intellectual support for logistics decisions have been developed. The software includes: (1) a subsystem for dividing the territory into zones, consisting of genetic clustering and zonal division program modules; (2) a subsystem for order distribution and logistics facility assignment, consisting of program modules to determine the required number of logistics facilities for zones, to assess conformity between logistics facilities and orders, and to assign logistics facilities. A comparative assessment was carried out, confirming the improved quality of logistics decisions made using the proposed method and software.
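The assignment step that the abstract attributes to a modified G. Kuhn method corresponds, in its unmodified textbook form, to a linear assignment problem. The sketch below solves it with SciPy over an invented compliance matrix; the paper's modification of the method and its cascading fuzzy compliance model are not reproduced here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical degree-of-compliance matrix: rows are logistics facilities,
# columns are orders, values in [0, 1] (higher means a better match).
compliance = np.array([
    [0.9, 0.4, 0.6],
    [0.3, 0.8, 0.5],
    [0.7, 0.6, 0.2],
])

# The Hungarian (Kuhn) algorithm minimizes cost, so negate compliance
# to maximize the total degree of compliance instead.
facilities, orders = linear_sum_assignment(-compliance)
for f, o in zip(facilities, orders):
    print(f"facility {f} -> order {o} (compliance {compliance[f, o]:.1f})")
```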

8. Earth's surface visualization in simulation systems [No. 4, 2019]
Authors: A.M. Giatsintov, K.A. Mamrosenko, P.S. Bazhenov
Visitors: 4316
Planet visualization is used in many fields: geographic information systems, multimedia systems, simulation systems, and simulators. The paper describes approaches to displaying the Earth's surface that provide real-time visualization. It also lists the problems that arise during the visualization of extended landscapes, related to performance, coordinate system transformation, and visualization accuracy. The paper presents an approach to the Earth's surface visualization based on the use of clipmaps. It allows simplifying data preparation for various parts of the Earth's surface, as well as reducing the number of prepared data sets for both the underlying surface and the textures with height data. To implement the proposed approach, the architecture of the component used to generate and visualize the Earth's surface was developed. The component is built into existing visualization systems. This architecture has the following advantages: it is not necessary to develop a specialized visualization system from scratch; it is possible to use existing visualization systems, both open and proprietary; and the computing load is distributed across the available threads in the thread pool. The creation of drawing calls that will be sent to the visualization system is one example of multi-threaded processing. The algorithm for processing incoming drawing calls depends on how integration with a particular visualization system is implemented. The component's architecture provides the ability to set the time during which data will be processed inside the component. After each operation is finished, the time taken to complete it is calculated and the ability to continue processing other tasks is determined. If the time limit is exceeded, task processing is terminated until the next component update call.
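The time-budget behaviour described in the last sentences can be illustrated by a minimal task queue that stops draining once its per-update budget is spent; the class and method names below are assumptions, not the component's actual API.

```python
import time
from collections import deque

class TerrainComponent:
    """Drains queued tasks (e.g. tile loads, draw-call builds) on each update,
    stopping once a per-update time budget is exhausted; the remaining tasks
    wait for the next update call. Illustrative only."""

    def __init__(self, time_budget_ms=4.0):
        self.time_budget = time_budget_ms / 1000.0
        self.tasks = deque()

    def submit(self, task):
        self.tasks.append(task)

    def update(self):
        start = time.perf_counter()
        while self.tasks:
            self.tasks.popleft()()                        # run one queued task
            if time.perf_counter() - start >= self.time_budget:
                break                                     # defer the rest

# Usage: queue many cheap tasks, drain only what fits into a 1 ms budget.
component = TerrainComponent(time_budget_ms=1.0)
for _ in range(1000):
    component.submit(lambda: sum(range(10_000)))
component.update()
print(f"tasks deferred to the next update: {len(component.tasks)}")
```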

9. The method for translating first-order logic formulas into positively constructed formulas [No. 4, 2019]
Authors: A.V. Davydov, A.A. Larionov, E.A. Cherkashin
Visitors: 4853
The paper considers the logical calculus of positively constructed formulas (PCF calculus) and the automated theorem proving (ATP) method based on it. The PCF calculus was developed and described as a first-order logic formalism in the works of S.N. Vassilyev and A.K. Zherlov as a result of formalizing and solving problems of control theory. There are examples of describing and solving some control theory problems that are effectively solved using the PCF calculus (in terms of the language expressiveness and the efficiency of the theorem proving means), for example, controlling a group of lifts, directing a telescope at the center of a planet in an incomplete phase, and mobile robot control. Compared to the capabilities of other logical means for subject domain formalization and inference search, the PCF calculus has the advantages of expressiveness combined with compactness of knowledge representation, natural parallelism of processing, larger block size and lower combinatorial complexity of inferences, high compatibility with heuristics, and great capabilities for interactive proof. The selected class of formulas makes it possible to build constructive proofs. This class of formulas is much wider than the class of Horn clauses used in Prolog. There are no restrictions in the logical formalization of the axiomatic base of the subject domain, and the target statement is a conjunction of queries (in Prolog terms). To test the ATP software system (prover) based on the PCF calculus, the authors used the TPTP (Thousands of Problems for Theorem Provers) library. The TPTP format has become a standard in the community that studies automated reasoning, so there is a natural need for the developed prover to accept problems in this format as input. Thus, the problem arises of translating first-order predicate logic formulas presented in the TPTP format into the PCF format. This problem is nontrivial due to the special structure of PCF calculus formulas. The paper proposes a more efficient translation method (compared to the algorithm previously developed for the first implementation of the PCF-based prover) for the first-order predicate calculus language that preserves the original heuristic knowledge structure, and a simplified version of it for problems presented in the language of clauses. Efficiency is measured by the number of steps and the length of the obtained formulas. The proposed method was implemented as a software system, a translator of first-order TPTP logic formulas into the language of the PCF calculus. The paper presents test results for the developed method, which imply that there is a certain class of first-order formulas that are not treated as special by existing ATP systems, while the PCF calculus has special strategies that increase the efficiency of the inference search for such formulas.

10. Features of using neural network models to classify short text messages [No. 4, 2019]
Authors: M.I. Dli, O.V. Bulygina
Visitors: 12831
Nowadays, public authorities are actively developing technologies of electronic interaction with organizations and citizens. One of the key tasks in this area is the classification of incoming messages for their operational processing. However, the features of such messages (small size, lack of a clear structure, etc.) do not allow using traditional approaches to the analysis of textual information. To solve this problem, it is proposed to use neural network models (artificial neural networks and a neuro-fuzzy classifier), which allow finding hidden patterns in documents written in a natural language. The choice of a specific method is determined by the approach to forming thematic headings: convolutional neural networks (for unambiguous definition of rubrics); recurrent neural networks (when word order in the rubric title is significant); and a neuro-fuzzy classifier (for intersecting thesauri of rubrics).
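For the convolutional option mentioned above, a minimal text-CNN classifier for short messages could look like the PyTorch sketch below; the vocabulary size, embedding dimension, filter widths, and number of rubrics are placeholders, since the abstract does not specify the architecture.

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    """Minimal convolutional classifier for short messages (illustrative only)."""
    def __init__(self, vocab_size=10_000, embed_dim=64, num_classes=5,
                 filter_widths=(2, 3, 4), num_filters=32):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w) for w in filter_widths
        )
        self.classifier = nn.Linear(num_filters * len(filter_widths), num_classes)

    def forward(self, token_ids):                        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)    # (batch, embed, seq)
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))  # rubric logits

# Usage on a dummy batch of 8 messages, each padded or truncated to 20 tokens.
model = TextCNN()
logits = model(torch.randint(1, 10_000, (8, 20)))
print(logits.shape)   # torch.Size([8, 5])
```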
