ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)

Articles of journal issue № 4, 2014.


1. Adaptive fuzzy systems on FOREL class taxonomy [№ 4, 2014]
Authors: Shchokin V.P., Chernyi S.G., Bordug A.S.
Visitors: 11128
The article presents the results of developing a method of neuro-fuzzy self-organization of structures in intelligent process control systems. The proposed modification of the basic algorithm improves the control performance index of intelligent automated control systems while reducing the volume of calculations and correspondingly increasing system performance. In the classical training rule for fuzzy neural networks, the number of production rules, the type of membership functions, the type of fuzzy inference algorithm, etc. are analyzed; if these parameters are chosen incorrectly, fuzzy neural networks can be ineffective in automation applications. The developed algorithm is based on the theory of sampling frequency and training frequency distribution. In the theory of discrete-time control systems, the sampling time T is usually selected according to the rule of thumb that the sampling rate must exceed the maximum frequency of the system. In traditional adaptive control systems, parameters are adjusted once every sampling period, so the sampling rate and the update rate are not separated. We propose an expert method for determining the concentration coefficient of membership functions and the sampling limits for further adjustments to the base of adaptively established rules, in order to reduce the algorithm running time and improve its efficiency when performing parametric synthesis of asymptotically stable intelligent control systems. The developed technique makes it possible to compare the membership function parameters obtained by the modified adaptive algorithm of the Wang-Mendel fuzzy network with the parameters obtained from statistical processing of the information system's solutions of dynamic object state identification. We developed a system operation algorithm that takes into account the developed method of self-organization of neuro-fuzzy structures based on the FOREL class taxonomy algorithm.
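For orientation, a minimal Python sketch of the basic FOREL taxonomy (clustering) step mentioned above is given below. The radius and tolerance values are assumptions for illustration; this is not the authors' neuro-fuzzy self-organization implementation.

    import numpy as np

    def forel(points, radius, tol=1e-6):
        """Basic FOREL taxonomy: float a sphere of a given radius to the local
        centroid until it stabilizes, then remove the captured points as one cluster."""
        remaining = points.copy()
        clusters = []
        while len(remaining):
            center = remaining[0]                      # arbitrary starting point
            while True:
                mask = np.linalg.norm(remaining - center, axis=1) <= radius
                new_center = remaining[mask].mean(axis=0)
                if np.linalg.norm(new_center - center) < tol:
                    break                              # sphere has stabilized
                center = new_center
            clusters.append(remaining[mask])
            remaining = remaining[~mask]               # cluster the rest
        return clusters

    # toy usage: two well-separated groups of 2-D points
    data = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 10])
    print(len(forel(data, radius=3.0)))                # typically 2 clusters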

2. Planning algorithms of aviation materials fatigue tests [№ 4, 2014]
Authors: Agamirov L.V., Agamirov V.L., Vestyak V.A.
Visitors: 9029
Dissipation of the fatigue properties of construction materials is caused by a number of objective factors, but this problem can be addressed by developing and introducing into engineering practice effective mathematical methods for analyzing experimental information. This paper considers optimum planning of fatigue tests. Optimizing such tests is important because of their high cost for the aviation industry, so their objectives must be taken into consideration when planning the minimum number of tests. At the same time, the developed statistical analysis methodology is invariant to the type of tested objects. It increases the accuracy of determining design characteristics of endurance and the fatigue limit and considerably reduces the number of fatigue tests necessary to achieve the required accuracy, thus reducing their duration and cost. Using the existing tables of exact values of noncentral distribution fractiles, as well as direct mathematical calculation of exact fractile values, involves highly labour-intensive computational procedures. This paper presents new algorithms for test planning in fractile estimation and for exact calculation of fractiles of the noncentral Student distribution. This makes it possible to offer software products that are simpler and faster than those offered earlier.
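As an illustration of the quantities involved, fractiles of the noncentral Student distribution can be computed numerically; a minimal SciPy sketch follows. The degrees of freedom, noncentrality parameter and probability level are arbitrary example values, not the planning algorithms from the paper.

    from scipy.stats import nct

    # noncentral Student's t: df degrees of freedom, noncentrality nc
    df, nc, p = 9, 1.5, 0.95          # arbitrary example values
    fractile = nct.ppf(p, df, nc)     # exact p-fractile (quantile)
    print(f"{p:.0%} fractile of t(df={df}, nc={nc}) = {fractile:.4f}")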

3. Tool environments for production optimization and planning based on evolutionary metaheuristics [№ 4, 2014]
Authors: Afonin P.V., Malikova A.A., Sashilina K.S.
Visitors: 13396
This article presents tool environments based on evolutionary metaheuristics for solving tasks from three subject domains: optimization of material cutting with a dynamic order portfolio, modeling and optimization of supply chain warehouse policy, and project schedule optimization. The paper considers features of and methods for solving optimization and production planning problems. It describes the efficiency of the approach of developing modified metaheuristic operators that incorporate the optimization problem constraints. The paper considers the class of optimization problems in which the objective function is calculated using simulation. The cutting optimization environment includes an optimization module based on a genetic algorithm and a laying (nesting) algorithm for kitchen units. The user sets the unit sizes and their number, the color film for covering a unit, the urgency of the order and the desired utilization of the material; the system outputs automatically generated cutting cards. The warehouse management environment supports simulation of multi-inventory systems, including two-echelon systems, and optimization of model variables using an evolutionary strategy. The optimized parameters are the critical levels for ordering goods and the order volumes; the criteria include storage costs, supply organization costs and penalties for shortages of warehouse goods. The project schedule environment supports optimization based on a genetic algorithm whose crossover and mutation operators produce solutions that satisfy the problem constraints.
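As a generic illustration of the evolutionary-metaheuristic core described above, the sketch below runs a small genetic algorithm whose mutation operator keeps candidates inside an assumed box constraint by clipping. The objective function, bounds and settings are placeholders, not the cutting, warehouse or scheduling environments themselves.

    import random

    LOW, HIGH = 0.0, 10.0                      # assumed feasible bounds (constraint)

    def fitness(x):
        # toy objective standing in for a simulation-based criterion
        return -(x[0] - 3.0) ** 2 - (x[1] - 7.0) ** 2

    def mutate(x, sigma=0.5):
        # mutation that respects the box constraint by clipping back inside
        return [min(HIGH, max(LOW, xi + random.gauss(0, sigma))) for xi in x]

    def crossover(a, b):
        # uniform crossover; offspring of feasible parents stay feasible
        return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

    population = [[random.uniform(LOW, HIGH) for _ in range(2)] for _ in range(30)]
    for _ in range(100):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                          # truncation selection
        population = parents + [mutate(crossover(random.choice(parents),
                                                 random.choice(parents)))
                                for _ in range(20)]
    best = max(population, key=fitness)
    print("best solution:", [round(v, 2) for v in best])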

4. Development of an algorithmic data processing tool for a planned three-factor experiment to calculate a statistically adequate mathematical model of concrete strength [№ 4, 2014]
Authors: Belov V.V., Obraztsov I.V., Kuryatnikov Yu.Yu.
Visitors: 7825
Functional modeling is widely applied in building composite material technology alongside physical modeling (manufacturing a physical sample of the material) and structural-imitation modeling (imitating the interoperation of the system's structural elements). The result is a mathematical function that describes the behavior of the research object while abstracting from the internal structure of the material substratum. The authors have developed a program for processing data of a B-D13 three-factor planned experiment. It allows calculating the plan and processing the experiment results. The program algorithm includes a procedure for calculating the response function coefficients, statistical processing and visualization of the mathematical model. All basic calculations are performed cyclically, which allows rebuilding the mathematical model immediately when the input data change. In addition, the algorithm includes an auxiliary procedure for syntactic checking of the input data; the program corrects user actions with text notifications when input data errors occur. The program has additional functions for loading and saving data, as well as a function for exporting calculation results to Microsoft Excel, where the user can perform further processing. The software product interface is implemented as logical blocks that allow entering initial data and changing parameters of the mathematical model output in interactive mode. The developed software product was applied to a planned experiment studying the influence of concrete mixture parameters on the properties of filled fine-grained concrete; the authors obtained a statistically adequate mathematical model of concrete strength. The developed software can be used in any applied research problems that rely on mathematical modeling with orthogonal experiment planning to optimize research object properties and select compounding and technological parameters.
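The specific B-D13 plan and the experimental data are not reproduced here; as an illustration of calculating response function coefficients, the sketch below fits a generic second-order model for three coded factors by least squares on placeholder data.

    import numpy as np

    # Placeholder factor settings (coded levels) and strength values - example
    # numbers only, not the B-D13 plan or the data from the paper.
    X = np.array([[-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
                  [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
                  [-1,  0,  0], [ 1,  0,  0], [ 0, -1,  0], [ 0,  1,  0],
                  [ 0,  0, -1], [ 0,  0,  1], [ 0,  0,  0], [ 0,  0,  0]], float)
    y = np.array([31.0, 35.2, 33.1, 38.0, 29.8, 34.5, 32.2, 39.4,
                  30.5, 36.1, 32.0, 35.8, 31.2, 33.9, 36.0, 35.6])

    def design_matrix(X):
        # columns: intercept, x1..x3, pairwise interactions, squared terms
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1 * x2, x1 * x3, x2 * x3,
                                x1 ** 2, x2 ** 2, x3 ** 2])

    coeffs, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
    residuals = y - design_matrix(X) @ coeffs
    print("response function coefficients:", np.round(coeffs, 3))
    print("residual standard deviation:", round(residuals.std(ddof=10), 3))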

5. On one method of blood cell classification and its software implementation [№ 4, 2014]
Authors: Belyakov V.K., Sukhenko E.P., Zakharov A.V., Koltsov P.P., Kotovich N.V., Kravchenko A.A., Kutsaev A.S., Osipov A.S., Kuznetsov A.B.
Visitors: 15854
A method for classifying leukocytes, erythrocytes and thrombocytes is proposed. It is based upon a comprehensive study of various segmentation methods for microscopic images and of algorithms for calculating blood cell feature sets. Our approach assumes the application of an improved combined segmentation method for microscopic images, the use of an optimized feature vector of an object and a neural network classifier. An important role in the design of our segmentation method belongs to the EDEM method for comparative study of image processing algorithms developed at SRISA RAS. The segmentation method includes such steps as edge detection, contour closing and over-segmentation elimination (based upon a set of features calculated for each initial segment). For edge detection we use a combination of the classical Canny detector and the Ritter-Cooper method designed for blood cell segmentation, which combines the advantages of both algorithms. For the boundary enhancement and contour closing steps we use an approach based upon graph theory that develops the adaptive contour closure algorithm proposed by Jiang. Over-segmentation elimination is an iterative procedure. Our segmentation method is suitable for both red and white blood cell segmentation. To solve the blood cell classification task from a feature set we use a neural network of the multilayer perceptron type (a three-layer feedforward neural network with a sigmoid function in the hidden layer). The neural network classifier allows one to effectively separate the cells into the types used in practical hematology. A program library implementing the proposed classification method has been created. Our tests with various blood smear images have shown a high potential of our method for practical application.
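The sketch below shows only the two generic building blocks named in the abstract, a Canny edge-detection pass and a three-layer perceptron with a sigmoid hidden layer, on synthetic placeholder data; the combined Canny/Ritter-Cooper segmentation and the EDEM-based evaluation are not reproduced.

    import cv2
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # synthetic grayscale frame standing in for a microscopic blood smear image
    image = np.zeros((128, 128), np.uint8)
    cv2.circle(image, (64, 64), 25, 180, -1)          # a bright "cell"

    # edge detection step: generic Canny only (the paper combines Canny with
    # the Ritter-Cooper method, which is not reproduced here)
    edges = cv2.Canny(image, threshold1=50, threshold2=150)
    print("edge pixels found:", int((edges > 0).sum()))

    # classification step: a three-layer feedforward network with a sigmoid
    # hidden layer, trained on per-cell feature vectors (placeholder data)
    X_train = np.random.rand(200, 12)                 # hypothetical feature vectors
    y_train = np.random.randint(0, 3, 200)            # 0/1/2 = cell classes
    clf = MLPClassifier(hidden_layer_sizes=(16,), activation="logistic",
                        max_iter=500, random_state=0)
    clf.fit(X_train, y_train)
    print("predicted classes:", clf.predict(np.random.rand(5, 12)))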

6. Tracing and self-healing in POSIX systems [№ 4, 2014]
Authors: Bombin A.A., Galatenko V.A., Kostyukhin K.A.
Visitors: 8974
This paper formulates a definition of the original concept of controlled execution and motivates the importance of this concept when creating complex systems. Controlled execution is a specially organized process of hardware and software system functioning in which the system is intended to perform its tasks despite errors, attacks and failures. The basics of the controlled execution concept are: integration of information security, debugging and management tools; extension of controlled execution to all phases of the system life cycle; integrity of the controlled execution tools, which differ in their degree of impact on the target system; and the possibility of interactions between these tools. Special cases of controlled execution are: information system controlling; interactive debugging; system monitoring; system self-control; playback of previous system sessions; modeling, collection and analysis of quantitative system characteristics; and system self-healing. In the context of controlled execution, the authors propose a self-healing technique for POSIX systems based on the POSIX trace mechanism. A brief review of the trace mechanism described in POSIX-2001 is given. The paper proposes a technique of software system self-healing based on this mechanism and integrated into the controlled execution concept. POSIX-2001 fixes the minimum functionality of tracing tools that should be provided by a POSIX-compliant operating system; the standard defines tracing as the collection, accumulation and analysis of data on events that took place during user application operation. The work includes an example which can be useful in the practical application of self-healing methods.
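The POSIX trace interface referenced above is a C-level API and is not reproduced here; as a language-neutral illustration of the self-healing idea only (observe a process, restart it after abnormal termination), a small supervision sketch with a placeholder workload is given below.

    import subprocess
    import time

    # A minimal supervision loop illustrating the self-healing idea only:
    # run a target command, watch its exit status, and restart it on failure.
    # (The paper's technique is built on the POSIX trace mechanism; the
    # command below is a hypothetical placeholder workload.)
    TARGET = ["python3", "-c", "import sys; sys.exit(1)"]

    failures = 0
    while failures < 3:                      # give up after repeated failures
        proc = subprocess.run(TARGET)
        if proc.returncode == 0:
            break                            # normal termination, nothing to heal
        failures += 1
        print(f"abnormal exit ({proc.returncode}), restarting ({failures}/3)")
        time.sleep(1)                        # back off before the restart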

7. Research on the capabilities of an FPGA hardware module for load testing tasks [№ 4, 2014]
Author: Borodin A.A.
Visitors: 7800
Quality and reliability assurance of information systems is an important task. Nowadays it is mostly addressed using load testing. Constant progress of information technologies requires increasing testing efficiency. Load testing is a complicated process that includes many stages. Analysis of theory and practice showed that researchers pay insufficient attention to the test launching stage. The efficiency of this stage depends on the performance of the loader program and on computer characteristics. In practice, the resources of a single computer are not enough to produce the required amount of load. That is why load-creating methods based on distributed computing are used, although they have disadvantages; these methods include cloud and cluster computing, as well as groups of computers connected by a local network. This paper presents experimental measurements of the characteristics of a hardware loader created by the author on the basis of an FPGA to increase the efficiency of the load creation stage. The device allows performing load testing without using additional computers. During the experiments, the maximum load capability of the created prototype was determined and the results were compared with the characteristics of existing computer systems.

8. Automatic feature selection system for human emotion recognition in speech communication [№ 4, 2014]
Authors: Brester C.Yu., Semenkin E.S., Sidorov M.Yu.
Visitors: 7862
During human-machine communication a number of problems related to voice processing should be solved. In addition to the speech recognition problem, there are several important issues such as speaker, gender or age identification and speech-based emotion recognition. The number of acoustic characteristics extracted from the signal is extremely high (hundreds or even thousands): features may correlate with each other, contain noisy data or have a low variation level, which decreases the accuracy of the classifiers involved. Therefore it is vitally important to select informative features automatically during the recognition process. This paper considers two feature selection techniques. Both of them are based on a self-adaptive multi-objective genetic algorithm that is adjusted while the problem is being solved. The main advantages of this heuristic optimization procedure are the simplicity of encoding the informative feature subset and the ability to optimize both discrete and continuous criteria. A probabilistic neural network is used as the classifier. The effectiveness of the developed approaches was investigated on a set of emotion recognition problems using databases containing speech signals in English and German. The experiments revealed that applying the described feature selection procedures can increase classification accuracy (a relative improvement of up to 22.7 %). Moreover, it became possible to significantly reduce the dimension of the feature vector (from 384 to 64.8 attributes on average). The proposed schemes demonstrate higher effectiveness than Principal Component Analysis. The described methods can be applied to speaker identification and to recognizing a speaker's gender, age or other personal characteristics, which also implies the opportunity to use them as the algorithmic core of the intelligent modules of dialogue systems.
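A minimal single-objective sketch of genetic feature selection with a binary mask encoding is shown below; the self-adaptive multi-objective algorithm and the probabilistic neural network from the paper are not reproduced, and the data set, classifier and settings are assumptions.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                               random_state=0)

    def fitness(mask):
        # accuracy of a simple classifier on the selected feature subset
        if not mask.any():
            return 0.0
        return cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()

    # genetic algorithm over binary masks (True = feature kept)
    population = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)
    for _ in range(15):
        scores = np.array([fitness(ind) for ind in population])
        parents = population[np.argsort(scores)[-10:]]            # keep the best half
        children = []
        for _ in range(10):
            a, b = parents[rng.integers(10)], parents[rng.integers(10)]
            child = np.where(rng.random(X.shape[1]) < 0.5, a, b)  # uniform crossover
            child ^= rng.random(X.shape[1]) < 0.02                # bit-flip mutation
            children.append(child)
        population = np.vstack([parents, children])

    best = population[np.argmax([fitness(ind) for ind in population])]
    print("selected features:", int(best.sum()), "of", X.shape[1])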

9. Rear-projection method in the visualization subsystem of a training simulation system [№ 4, 2014]
Authors: Giatsintov A.M., Mamrosenko K.A.
Visitors: 10016
Nowadays many industry sectors (for example, the aerospace industry) experience a shortage of qualified specialists capable of controlling complex technical systems. There is a need to greatly increase the efficiency of training centers, particularly by adopting new training methods and developing more effective training complexes. Using multimedia technologies in the development of training simulation systems allows creating data banks of training materials that include images, texts accompanied by audio, video and visual effects, and an interactive interface, and provides an effective way of using the collected information for distance and distributed training. One way of using multimedia data in training simulation systems is to insert a graphical representation of the instructor into the virtual environment. In order to correctly visualize an image of the instructor in a 3D virtual scene, a rear-projection method has been developed. It is based on 3D keying, whose main function is to separate an object from a uniform background. This process can be described as creating a mask that contains information about image translucency and separates an object from the other parts of an image. Any keying method is resource-intensive, so processing large images on the CPU can lead to performance problems in the visualization subsystem, while one of the requirements for the visualization subsystem is real-time operation. Considering this requirement, the keying algorithm implementation uses video card resources to process images.
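As a CPU-side illustration of the basic keying step (building a translucency mask over a uniform background), the sketch below uses OpenCV color thresholding on a synthetic frame; the GPU implementation and the specific 3D keying algorithm from the paper are not reproduced, and the threshold values are assumptions.

    import cv2
    import numpy as np

    # synthetic frame: a uniform green background with a gray "object" in front
    frame = np.zeros((240, 320, 3), np.uint8)
    frame[:] = (0, 200, 0)                            # BGR green backdrop
    cv2.rectangle(frame, (120, 60), (200, 180), (128, 128, 128), -1)

    # build a translucency mask: background pixels -> 0, object pixels -> 255
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    background = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))   # assumed green range
    alpha = cv2.bitwise_not(background)               # invert: keep the object

    # composite the keyed object onto a new (blue) virtual scene
    scene = np.zeros_like(frame)
    scene[:] = (200, 0, 0)
    mask3 = cv2.merge([alpha, alpha, alpha]) // 255
    composited = frame * mask3 + scene * (1 - mask3)
    print("object pixels kept:", int((alpha > 0).sum()))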

10. Algorithm and software for calculating biotechnological processes with substrate and biomass recirculation [№ 4, 2014]
Authors: Gordeeva Yu.L., Shcherbinin M.Yu., Gordeev L.S., Makarov V.V.
Visitors: 6816
This article discusses a method of increasing the efficiency of a wide range of anaerobic microbiological processes by introducing recirculation into a continuous microorganism cultivation scheme. Improving the operation of continuous biotechnological processes is possible with the use of substrate and biomass recirculation. Recycling allows increasing the flow rate and the process productivity with respect to the target component. At the same time, recycling imposes additional constraints on setting the initial data, namely the initial substrate concentration and the magnitude of the recirculation volume flow rate, which should be handled in the algorithm and the program in accordance with the kinetic characteristics. Based on kinetic relations, the authors have developed an algorithm and its software implementation for calculating the indicators of an anaerobic biotechnological process with recirculation of the substrate and biomass, provided complete separation of the product from the recycle stream. The relations make it possible in practice to relate the circulation flow rate D_R to the productivity Q_P of the target product. The algorithm includes 21 calculation variants of process implementation, each of which can be evaluated in the presented flowchart. In addition to calculating process indicators, the program contains user documentation including a description of the variables and a brief program manual. Numerical results of calculations for different initial settings of technological parameters are given; they show that introducing recirculation increases the productivity Q_P of the target product, compared to the same scheme and input parameters without a recirculation flow. A distinctive feature of the approach is the variety of process variants covered, which allows assessing the positive effect of including recirculation in the scheme without experiments, using only the software product. The obtained relations and the software implementation of the calculations can serve as a working tool for process design and as a basis for further developments in this field, in particular for assessing transient regimes and the stability of steady states.
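For orientation, the sketch below computes productivity versus dilution rate for a generic steady-state chemostat under Monod kinetics without recycle; it is not the authors' recirculation model, and all kinetic constants are arbitrary example values.

    import numpy as np

    # Generic steady-state chemostat under Monod kinetics (no recycle):
    # mu = mu_max * S / (Ks + S); at steady state mu = D, so
    # S = Ks * D / (mu_max - D), X = Y * (S0 - S), productivity Q = D * X.
    mu_max, Ks, Y, S0 = 0.5, 1.0, 0.4, 20.0     # arbitrary kinetic constants

    def productivity(D):
        S = Ks * D / (mu_max - D)               # residual substrate
        X = Y * (S0 - S)                        # steady-state biomass
        return D * X if 0 < S < S0 else 0.0     # washout gives zero productivity

    for D in np.linspace(0.05, 0.45, 9):
        print(f"D = {D:.2f} 1/h  ->  Q = {productivity(D):.3f} g/(L*h)")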
