ISSN 0236-235X (P)
ISSN 2311-2735 (E)


Next issue

Publication date:
16 March 2018

Journal articles, issue № 1, 2017.

Order results by:
Publication date | Title | Authors

1. Neural network ensembles storage development [2017-03-13]
Authors: Puchkov E.V., S. Terekhov
Visitors: 2779
An important tool for a data analysis and machine learning expert is software for organizing experiments, primarily because data processing involves a large number of stages, each with its own implementation specifics. In the course of this work the authors designed and developed a prototype of a neural network ensemble storage for the structured storage of data at various stages of time series forecasting. The article considers the data model, the storage architecture and the mechanisms of data acquisition and redistribution in the storage. It also describes the developed class model for software-based interaction with the storage. MySQL was used to store data on objects and the relationships between them; the non-relational database InfluxDB was used to store time series. There is also a user interface with data visualization for convenient interaction with the neural network ensemble storage. The system was tested on solar activity data for the period from January 1700 to February 2015. The experiment (using an LSTM recurrent network) showed that the error of the neural network ensemble was lower than the error of each individual neural network model. The LSTM was built using the Keras library, and the Blending approach was used to form the ensemble. The results indicate the prospects of the developed software solution, which provides a high degree of integration into scalable Python software. The development of a fully functional system will allow not only organizing the data analysis process, but also improving the performance of the resulting models by automating the ensemble formation process.
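The Blending step mentioned in the abstract can be illustrated with a minimal sketch: base model predictions on a hold-out set are combined by a meta-model (here, simple least-squares weights). The data, noise levels and number of base forecasters below are purely illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
y_holdout = rng.normal(size=100)                  # true values on the hold-out set

# predictions of three hypothetical base forecasters (e.g. LSTM variants),
# each equal to the truth plus noise of a different magnitude
preds = np.stack([y_holdout + rng.normal(scale=s, size=100)
                  for s in (0.3, 0.5, 0.8)])

# blend weights via least squares: minimize ||w @ preds - y||
w, *_ = np.linalg.lstsq(preds.T, y_holdout, rcond=None)
ensemble = w @ preds

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

errors = [rmse(p, y_holdout) for p in preds]
print(rmse(ensemble, y_holdout), min(errors))
```

Since each individual forecaster is itself a linear combination of the base predictions (with a unit weight vector), the least-squares blend can never have a larger hold-out error than the best single model, which mirrors the ensemble-vs-single-model result reported in the abstract.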

2. Intelligent decision support in process scheduling in diversified engineering [2017-03-13]
Authors: Burdo G.B., Semenov N.A.
Visitors: 2203
Over the last fifteen years the structure of machine-building and instrument-making production has undergone major changes driven by customers' requirements to receive high-tech products within a certain time. This fact has forced companies to design and manufacture a large number of different products simultaneously, which has led them to diversification. Historically, diversified engineering and instrumentation enterprises were not equipped with automated tools to manage technological processes effectively. This can be explained by the high dynamism of their production systems, the lack of repeatability in product lists and manufacturing situations, and the influence of random factors that disturb the normal process flow. All this leads to longer and disrupted product delivery times and, as a result, to the deterioration of the financial and economic performance of enterprises and firms. In this regard, creating automated decision-making support systems within automated technological process control systems becomes an important problem. Dispatching of technological processes, which aims at returning them to a normal schedule, is one of the most important components of management. In this work the authors implemented a combined approach to generating control actions. Out of a large number of random disturbances, the automated system records the most important and most probable ones. By comparing and analyzing the planned and actual times (start and end times) of technological process operations and the possible development of the situation (accumulation or reduction of the discrepancy), the system accumulates the results and identifies the most likely causes of plan failure and possible control actions. The analysis is performed using a knowledge base built on production models. The identified causes serve as "tips" for the second phase.
At this stage, with a predetermined frequency or upon an exceptional situation, a group of experts from company employees discusses and evaluates alternatives. Fuzzy control defines a weighted assessment of the experts' confidence in achieving the desired result by executing various control actions, and the final decision is accepted.

3. Software interface design using elements of artificial intelligence [2017-03-13]
Authors: Zubkova T.M., E.N. Natochaya
Visitors: 2718
To develop high-quality software, all customer requirements must be reflected in the specification, so that customers and developers share a global view of the future software. One way to achieve mutual understanding is to develop a prototype of the user interface. The article describes methods of selecting an alternative version of an interface template using such artificial intelligence methods as expert evaluation and fuzzy-set theory. Users might be divided into five groups on the basis of individual characteristics (novice, regular, experienced, skilled, administrator). The article defines the basic parameters of individual characteristics which may help to classify users when designing interfaces (computer literacy, systematic experience, experience of working with similar programs, typing, thinking, memory, motor skills, blindness, concentration, emotional stability). The paper describes the mathematical support and software for solving the problems of intelligent user interface design. The task is implemented in three stages. The first stage, "Forming and assessing expert group competence", defines the characteristics of the experts. A quantitative description of the experts' characteristics is based on calculating relative competence ratios according to the experts' statements in the advisory group. The second stage, "Group expert assessment of the object with direct assessment", determines recurrent relations for the iterations. The third stage, "Building a fuzzy model on fuzzy binary relations", operates on two fuzzy sets: a set of user groups and a set of interface templates that are maximally effective for users with these characteristics. The fuzzy model's input data are the selected fuzzy sets; the output data are the degrees to which interface templates match the users.
The user interface design process is automated on the basis of the proposed methodology in order to improve objectivity and optimize the decisions taken by software developers.
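The third stage described above, matching user groups to interface templates through fuzzy binary relations, can be sketched with a standard max-min composition. All membership values, group names and template counts below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# R1: fuzzy membership of one user in five user groups
# (novice, regular, experienced, skilled, administrator)
user_to_group = np.array([[0.1, 0.7, 0.4, 0.1, 0.0]])  # mostly a "regular" user

# R2: fuzzy suitability of three interface templates for each group
group_to_template = np.array([
    [0.9, 0.3, 0.1],   # novice
    [0.6, 0.8, 0.2],   # regular
    [0.3, 0.9, 0.5],   # experienced
    [0.1, 0.6, 0.9],   # skilled
    [0.0, 0.4, 1.0],   # administrator
])

def maxmin(a, b):
    """Max-min composition of two fuzzy relations."""
    return np.max(np.minimum(a[:, :, None], b[None, :, :]), axis=1)

# degrees to which each template matches this user
degrees = maxmin(user_to_group, group_to_template)[0]
best = int(np.argmax(degrees))
print(degrees, best)
```

The output vector plays the role of the "degrees of matching interface patterns to users" from the abstract; the template with the highest degree would be offered as the prototype.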

4. Implementation of reinforcement learning methods based on temporal differences and a multi-agent approach for real-time intelligent systems [2017-03-13]
Authors: A.P. Eremeev, A.A. Kozhukhov
Visitors: 3091
The paper describes the implementation of reinforcement learning methods based on temporal differences and a multi-agent technology. The authors examine the possibilities of combining learning methods with statistical and expert forecasting methods for further integration into an instrumental software environment for use in modern and advanced real-time intelligent systems (RT IS), a class that includes real-time intelligent decision support systems (RT IDSS). The paper analyzes reinforcement learning (RL) methods in terms of their use in RT IS, their main components, benefits and tasks. It focuses on RL methods based on temporal differences (TD methods) and presents the corresponding developed algorithms. The authors consider the possibility of including RL methods in a multi-agent environment and combining them with statistical and expert forecasting methods within the environment developed for RT IDSS for complex technical object control and diagnosis. The paper proposes the architecture of a forecasting subsystem prototype consisting of an emulator that simulates the state of the environment, a forecasting module, an analysis and decision-making module, and a multi-agent RL module. The forecasting subsystem prototype has been implemented in software using a multi-agent approach to solve the problem of expert diagnosis of a complex technological object. Based on the results of testing and validation of the developed system, the paper draws conclusions about the efficiency and expediency of including it in RT IDSS.
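The core of the TD methods the abstract refers to is the tabular TD(0) update, V(s) ← V(s) + α·(r + γ·V(s') − V(s)). The toy chain environment below is an illustration only; it is not the paper's emulator or diagnosis task.

```python
import random

random.seed(1)
N = 5                      # states 0..4; reaching state 4 is terminal with reward 1
V = [0.0] * N              # tabular value estimates
alpha, gamma = 0.1, 0.9    # learning rate and discount factor

for _ in range(2000):      # episodes
    s = 0
    while s != N - 1:
        # environment: move forward 1 or 2 states, capped at the terminal state
        s_next = min(s + random.choice([1, 2]), N - 1)
        r = 1.0 if s_next == N - 1 else 0.0
        # TD(0) update toward the bootstrapped target r + gamma * V(s')
        V[s] += alpha * (r + gamma * V[s_next] - V[s])
        s = s_next

print([round(v, 2) for v in V])
```

After training, the value estimates increase toward the terminal state, reflecting the discounted distance to the reward; this is the same bootstrapping mechanism that a multi-agent RL module would apply per agent.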

5. Modelling and simulation of Black Hole attack on wireless networks [2017-03-13]
Authors: V.V. Shakhov, A.N. Yurgenson, O.D. Sokolova
Visitors: 2466
Technologies based on wireless sensor networks (WSN) can be used in a wide range of vital applications. There are several implementations of the Internet of Things (IoT) architecture based on wireless sensor networks. For example, the core objective of projects in the 7th Framework Programme funded by the European Union was to provide the technical foundation for WSN technology in IoT products and services. As WSN-based applications are deployed, security becomes an essential requirement. In this paper the authors discuss the state of the art of security issues in WSN. It is impossible in all cases to provide absolute protection and eliminate the consequences of intrusion. However, an effective set of protection mechanisms can significantly reduce the damage. To achieve this, it is necessary to develop and explore appropriate mathematical models. The paper focuses on the Black Hole attack, which has one of the most dangerous destructive impacts on information: as a result, more than 90 % of the information transmitted to the sink may be lost. Direct transmission of information between WSN nodes is possible only if they are within each other's radio range. Therefore, unit disk graphs (UDG graphs) are the most appropriate model of communication in such networks. To simulate data transmission by the routing algorithm, a spanning tree of the graph is constructed. The authors have obtained formulas for analytical estimates for some spanning tree structures. To assess the vulnerability of the tree to attacks, the authors used the "normalized number of vertices with lost information": the average number of vertices that lose information divided by the total number of nodes in the tree. The analytical results are consistent with the simulation results.
The paper also offers a method for counteracting the Black Hole attack and provides the corresponding performance analysis.
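The vulnerability metric described above can be computed directly on a routing tree: if a black-hole node drops all traffic, every node whose route to the sink passes through it loses its data. The small tree below is an illustrative example, not one of the paper's analytical cases.

```python
# routing (spanning) tree given as child -> parent; the sink is node 0
parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2, 6: 5}

def lost_if_blackhole(bh):
    """Count nodes whose path to the sink passes through bh (including bh)."""
    lost = 0
    for v in parent:                  # the sink itself is never a victim
        u = v
        while u != 0:
            if u == bh:
                lost += 1
                break
            u = parent[u]
    return lost

n = len(parent) + 1                   # total nodes, including the sink
# "normalized number of vertices with lost information": average loss
# over all possible black-hole positions, divided by the tree size
normalized = sum(lost_if_blackhole(v) for v in parent) / (len(parent) * n)
print(normalized)
```

For this tree the metric equals 11/42 ≈ 0.26: a black hole near the sink (e.g. node 1 or 2) cuts off an entire subtree, which is exactly why attacks close to the sink are the most damaging.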

6. Forecasting in dynamic system control [2017-03-13]
Author: Tikhanychev O.V.
Visitors: 3141
A precondition for adequate management of complex systems is a stock of information on their current state and operating conditions. Typically, such data is obtained from environment monitoring systems. However, ordinary monitoring of dynamic systems does not always ensure effective management. In some cases it is appropriate to introduce feedback into the control loop. But even this approach does not always work, especially when managing large distributed systems with high inertia. To ensure management efficiency, proactive feedback is required: not only monitoring the state of the system and the environment, but also obtaining information on their possible changes in advance, that is, using forecasting methods. It is now common practice to divide all forecasting methods into active ones (which evaluate the possible consequences of decisions made) and passive ones (which forecast how a state will change under current conditions). The paper proposes using passive forecasting to form active-type feedback, which allows a user to form control actions in advance, taking into account the forecast of how the situation will develop.

7. Time delay automatic control systems: robustness, response time, synthesis [2017-03-13]
Authors: V.D. Than, D.Yu. Berchuk
Visitors: 2657
The article considers the synthesis of time delay automatic control systems (ACS). The synthesis problem is reduced to solving an equation by a numerical technique without approximating the time delay element, which has the potential to improve the accuracy of regulator synthesis. The real interpolation method (RIM) has a number of advantages in the synthesis of time delay ACS. RIM allows finding a regulator that compensates the time delay using a target model, a given non-approximated mathematical model of the object and a regulator structure. Moreover, at the stage of developing the target model, RIM allows building a transfer function directly from quality factors, which is simpler and more useful for a developer. The article demonstrates the potential of the method using a minimal set of factors: overshoot and settling time. The target model, together with a given non-approximated mathematical model and a regulator structure, allows building an accurate synthesis equation, which is useful for the numerical study of the synthesis and robustness properties of time delay systems. The article proposes an algorithm for achieving the maximal control speed of a time delay system subject to a specified overshoot. It also shows the negative effect of an increasing time delay on the robustness of the system. The results are graphically illustrated and supported by a number of examples.

8. On an approach to construct asset master data management system [2017-03-13]
Authors: A.A. Sukhobokov, V.I. Strogonova
Visitors: 2348
The paper describes the capabilities of current MDM (Master Data Management) solutions and the prospects of multidomain and multivector MDM solutions. It presents the reasons why single-domain MDM solutions for asset data have not been used successfully, in contrast to existing MDM solutions for other data domains such as customers, suppliers, products, employees, etc. There are challenges in combining several different representations of the same assets in an asset data MDM solution. The authors conclude that as long as adequate single-domain MDM solutions for asset data have not been developed and implemented successfully, it is too early to move this subject area to multidomain systems. To solve the problems described above, the authors propose a model of asset master data that enables combining different representations. This model includes multiple independent hierarchies for different representations of the same assets; non-hierarchical links specific to each subject area; grids of links allowing transitions between different representations of the same asset; a set of asset classifiers whose classes define sets of attributes for describing assets; classifiers of links between assets; as well as structural and functional models for individual asset types. To implement the proposed asset master data model, the authors have developed a special architecture for an asset data MDM solution, as well as an algorithm for checking the integrity of links between different representations across the whole data model. Key requirements are defined for the tools for developing a prototype of the solution: they must provide the functionality of a graph DBMS and, at the same time, a graph engine capable of running complex algorithms on the graph as a whole.
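The idea of independent hierarchies joined by a grid of links, and of checking the integrity of that grid, can be sketched in a few lines. The hierarchy contents, asset names and link pairs below are hypothetical; the paper's actual data model and algorithm are richer.

```python
# two independent hierarchies over the same physical assets,
# each given as child -> parent (None marks the root)
functional = {"plant": None, "line1": "plant", "pump1": "line1"}
financial = {"company": None, "cost_center_7": "company",
             "asset_0042": "cost_center_7"}

# grid of links: the same physical asset in both representations
links = [("pump1", "asset_0042"), ("line1", "cost_center_7")]

def check_links(grid, left, right):
    """Return the links whose endpoints are missing from either hierarchy."""
    return [(a, b) for a, b in grid if a not in left or b not in right]

# a dangling link (left endpoint absent) should be reported as broken
broken = check_links(links + [("pump2", "asset_0042")], functional, financial)
print(broken)
```

A real implementation would run such a check over the whole graph inside the DBMS, but even this sketch shows why the check must span both hierarchies at once: each hierarchy is internally consistent, and only the grid reveals the dangling reference.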

9. System models construction based on LTL formula equational characteristics [2017-03-13]
Authors: Korablin Yu.P., Shipov A.A.
Visitors: 2258
Verification of software and technical systems has always been and still remains one of the most significant tasks since the appearance of the first computing devices. Today there are quite a few approaches to solving this problem. However, the development of such a formal verification method as Model Checking helped to solve the problem of representing the systems being verified and to unify the verification process for software and technical systems. Its main idea is to transform the original system into a unified form, so that the verification process requires only a model that describes the system's behavior as precisely as possible. The article considers the possibility of constructing system models using RLTL notation (Recursive Linear Temporal Logic), a recursive representation of linear temporal logic formulas. However, its usage is not limited to this aspect. The advantage of using RLTL for these purposes is that models based on it can be verified against requirements that are also RLTL-based, without conversion to any other data structure, which helps to simplify the verification process and improve its performance. Furthermore, the article describes formal tools that in many cases allow simplifying RLTL-based models by reducing the number of their states and transitions.

10. Using Bayes' theorem to estimate CMMI® practices implementation [2017-03-13]
Authors: G.I. Kozhomberdieva, D.P. Burakov, M.I. Garina
Visitors: 2386
The article is devoted to an expert estimation methodology (based on objective evidence) for appraising the extent of implementation of the practices that ensure achievement of the goals of CMMI® model process areas. The model was developed by the Software Engineering Institute (SEI) at Carnegie Mellon University. Such appraisals are necessary to understand the maturity level of the software development processes in a developer company. In the case of uncertain and/or incomplete information on the implementation of CMMI® practices, it is reasonable to use a toolkit for decision-making in weakly formalized subject domains; it helps to increase the degree of confidence in the decisions of appraisal team members. In previously published work, the authors considered two approaches to constructing the estimate: fuzzy logic methods and multi-criteria classification methods. This article attempts to make the appraisal procedure even simpler and more flexible, to expand the opportunities for its use and to increase its objectivity. The proposed approach is based on the well-known Bayes' theorem. The extent of a CMMI® practice's implementation is estimated via a probability distribution over a set of hypotheses, each of which assumes that the implementation has reached one of the predefined levels. The Bayesian estimate of the practice implementation extent is understood as the a posteriori probability distribution, which is revised and refined during the estimation. The conditional probability values used when calculating the Bayesian estimate show how strongly the hypotheses about a practice's implementation level are supported by the obtained objective evidence.
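The Bayesian revision scheme described above is a straightforward posterior update over the hypothesis set. The level names, prior and likelihood values below are illustrative assumptions, not CMMI® appraisal data.

```python
# hypotheses: the practice's implementation has reached one of these levels
levels = ["not", "partially", "largely", "fully"]
prior = [0.25, 0.25, 0.25, 0.25]          # uniform prior over the hypotheses

def bayes_update(prior, likelihood):
    """Posterior is proportional to prior * P(evidence | hypothesis)."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)                         # normalizing constant
    return [p / z for p in post]

# objective evidence 1: a required work product exists;
# P(this evidence | each hypothesis), estimated by the appraisal team
evidence1 = [0.05, 0.40, 0.80, 0.95]
# objective evidence 2: the practice appears in the process description
evidence2 = [0.02, 0.20, 0.70, 0.90]

post = bayes_update(prior, evidence1)     # revise after the first evidence
post = bayes_update(post, evidence2)      # refine after the second
best = levels[max(range(len(post)), key=post.__getitem__)]
print([round(p, 3) for p in post], best)
```

Each piece of objective evidence sharpens the distribution, and the final posterior is the "Bayesian estimation of a practice implementation extent" from the abstract; the most probable hypothesis gives the reported level.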
