ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)

Articles of journal issue no. 1, 2016.

1. Comatch: a search engine for transcription factor cooperative binding sites [no. 1, 2016]
Authors: Nikitin S.I., Cheremushkin E.S.
Visitors: 9070
Recognition of short sequences called transcription factor binding sites is one of the most important problems in bioinformatics. Transcription factor binding sites are short sequences located in DNA regulatory regions; they play a key role in transcription, a basic process in every living organism. About 100 algorithms have been developed to solve this problem, and their number is still growing. However, there is no universal algorithm, because many factors influence binding. For example, in complex living organisms transcription factors join into complexes when binding to DNA. Here we present a novel algorithm that predicts statistically overrepresented pairs of transcription factor binding sites. The first site in a pair belongs to an initially known, fixed transcription factor, and the other factor has to be found. The method uses two kinds of input data: an experimental sequence set and a background sequence set. It searches for significant differences between sites in the experimental and background sets. As a result, the user obtains a list of binding site pairs with P-values, which characterize the probability of obtaining a pair by chance, and an FDR (False Discovery Rate) calculated for every pair. In addition, the authors developed a P-value correction option for datasets overrepresented by the anchor matrix binding sites. In this case, the dependence between the P-value and the selected anchor matrix is eliminated. As a result, the significance of the obtained results increases.
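As a rough illustration of the pair-overrepresentation statistics described above, the following sketch scores a candidate pair by comparing its occurrence counts in the experimental and background sets and then applies a Benjamini-Hochberg FDR correction. The abstract does not state which exact test Comatch uses, so the Fisher's exact test, the function names and the counts below are illustrative assumptions.

```python
# A minimal sketch of pair-overrepresentation scoring, assuming a Fisher's
# exact test on pair occurrence counts; the test choice and all numbers are
# illustrative, not the authors' actual statistic.
from scipy.stats import fisher_exact

def pair_pvalue(exp_with, exp_total, bg_with, bg_total):
    """P-value that a site pair is overrepresented in the experimental set."""
    table = [[exp_with, exp_total - exp_with],
             [bg_with,  bg_total - bg_with]]
    _, p = fisher_exact(table, alternative="greater")
    return p

def benjamini_hochberg(pvalues):
    """FDR q-values for every pair (Benjamini-Hochberg procedure)."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    q = [0.0] * m
    prev = 1.0
    for rank, i in reversed(list(enumerate(order, start=1))):
        prev = min(prev, pvalues[i] * m / rank)
        q[i] = prev
    return q

# Example: a candidate partner matrix co-occurring with the anchor matrix
# in 40 of 200 experimental sequences vs. 15 of 400 background sequences.
p = pair_pvalue(40, 200, 15, 400)
print(p, benjamini_hochberg([p, 0.2, 0.01]))
```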

2. Modeling of the time-cycle control system for technological process equipment in pharmaceutical solid dosage manufacturing based on Petri nets [no. 1, 2016]
Authors: Okai D.E.Ya., Klyushin A.Yu., Bogatikov V.N.
Visitors: 7061
The key purpose of this research is to develop an approach to modeling batch manufacturing of pharmaceutical solid dosage forms using a mathematical tool called a Petri net. Such a capability can be provided by a model that is easily understood by system analysts and technological engineers. This paper describes a system analysis and a model of manufacturing processes of pharmaceutical solid dosage forms with an emphasis on tablets. It also provides a brief classification of technological processes within the manufacturing system of solid dosage forms. The system analysis focuses on the manufacturing processes and the equipment used at each stage of the manufacturing system. The paper briefly considers the development of a Petri net model for manufacturing processes of pharmaceutical solid dosage forms. The Petri net model was implemented and analyzed using the PIPE tool (Platform Independent Petri net Editor 4.3) for Windows. The authors also developed and analyzed a timeline chart for the analysis of technological routes for manufacturing three different solid dosage forms. The resulting Petri net model and the timeline chart helped to create a clearer picture of the technological processes in the manufacturing system. The Petri net model developed in this paper can be used to solve the problem of choosing an optimal manufacturing model or optimal technological routes. It can also be used to improve the efficiency of using a set of equipment in the manufacturing system of pharmaceutical solid dosage forms.
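The sketch below illustrates only the place/transition formalism the model relies on: a transition fires when its input places hold enough tokens, consuming and producing tokens. The places and transitions are invented stand-ins for tablet-manufacturing steps, not the authors' actual PIPE model.

```python
# A minimal place/transition Petri net sketch; the places and transitions
# below are illustrative assumptions, not the model from the article.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)      # place -> token count
        self.transitions = {}             # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

net = PetriNet({"raw_material": 1, "mixer_free": 1})
net.add_transition("start_mixing", {"raw_material": 1, "mixer_free": 1},
                   {"mixing": 1})
net.add_transition("finish_mixing", {"mixing": 1},
                   {"granulate": 1, "mixer_free": 1})
net.fire("start_mixing")
net.fire("finish_mixing")
print(net.marking)   # {'raw_material': 0, 'mixer_free': 1, 'mixing': 0, 'granulate': 1}
```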

3. Review of studies on time series forecasting based on hybrid methods, neural networks and multiple regression [no. 1, 2016]
Authors: Yarushev S.A., Averkin A.N.
Visitors: 7159
The article gives a detailed overview of studies in time series forecasting and considers the history of forecasting method development. The authors review the latest forecasting methods, such as statistical, connectionist and hybrid methods, as well as forecasting methods based on multiple regression, their basic parameters, application areas and performance. The paper considers recent research in the field of hybrid forecasting methods, gives a short overview of these methods and notes their efficiency as reported by their authors. The paper emphasizes the use of Big Data in forecasting: it considers a forecasting model based on Big Data technology that hybridizes soft computing and artificial neural networks and tests it on stock market data. The article also considers a model based on neural networks, wavelet analysis and the bootstrap method, developed for flow forecasting in water resources management. The paper gives a detailed comparative study of methods based on neural networks and multiple regression: it considers different studies with a description of the comparison methods and results, and compares these methods on the example of predicting the housing market, with a detailed analysis of both methods on different samples. At the end the authors give the results of the study and compare the forecasting results.
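One of the baselines compared in the reviewed studies, multiple regression, can be sketched as an ordinary least-squares fit on lagged values of the series. The lag order and the data below are illustrative assumptions rather than figures from the compared studies.

```python
# A minimal sketch of a multiple-regression forecaster on lagged values;
# lag order and data are illustrative assumptions.
import numpy as np

def fit_ar_regression(series, lags=3):
    """Least-squares fit of y_t on (y_{t-1}, ..., y_{t-lags}, 1)."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([y[lags - k - 1:len(y) - k - 1] for k in range(lags)]
                        + [np.ones(len(y) - lags)])
    coef, *_ = np.linalg.lstsq(X, y[lags:], rcond=None)
    return coef

def forecast_next(series, coef, lags=3):
    recent = list(series[-lags:])[::-1]        # y_t, y_{t-1}, ..., newest first
    return float(np.dot(coef[:-1], recent) + coef[-1])

data = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
coef = fit_ar_regression(data)
print(forecast_next(data, coef))               # one-step-ahead forecast
```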

4. Adaptive landmark selection in the shortest-path problem for a big graph [no. 1, 2016]
Authors: Bykova V.V., Soldatenko A.A.
Visitors: 8623
The shortest path (SP) problem is one of the main routing problems in graph theory. It arises in web-structure analysis, navigation systems, traffic modeling and logistics optimization. The problem requires finding the shortest path between two given vertices of the initial directed graph G, minimizing the sum of the weights of the edges forming this path. Traditionally the SP problem is solved using Dijkstra's algorithm. The algorithm works by assigning labels to the vertices of graph G and uniformly expands the search space, starting from the start vertex s towards the target vertex t. There are many modifications of Dijkstra's algorithm aimed at reducing running time. The A* (A star) algorithm is one of them; its acceleration is achieved using a potential function defined on the set of vertices of graph G. The ALT (A* with Landmarks & Triangle inequality) algorithm is based on the A* algorithm; here the potential function is defined by a set of landmarks (a certain subset of the vertices of G). Different landmark selections in G correspond to different potential functions. Selecting the optimal set of landmarks is carried out over a finite set of variants and is an NP-hard problem. The ALT algorithm consists of two phases: the first phase performs preliminary graph processing to select landmarks and compute the potential function; the second phase computes the shortest (s, t)-path using the potential function. The paper suggests an adaptive heuristic for landmark selection. This heuristic uses the execution history of recent queries for shortest paths in graph G and corrects the current set of landmarks for efficient execution of a newly received query for the shortest (s, t)-path in the current graph. In the proposed modification, the ALT algorithm and Dijkstra's algorithm are equivalent in terms of the asymptotic estimate of their performance. However, in practice the modified ALT algorithm applied to a big graph is much faster than Dijkstra's algorithm. This is confirmed by the computational results presented in this work.
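A minimal sketch of the ALT idea follows: landmark-to-vertex distances precomputed in the first phase give a triangle-inequality lower bound on dist(v, t), which serves as the A* potential in the second phase. The graph representation and function names are assumptions for illustration; the paper's adaptive landmark-selection heuristic itself is not reproduced here.

```python
# A minimal ALT sketch: the potential is the best triangle-inequality lower
# bound obtained from precomputed landmark distances, plugged into A*.
import heapq

def alt_potential(v, t, dist_from, dist_to):
    """Lower bound on dist(v, t); dist_from[L][x] = dist(L, x), dist_to[L][x] = dist(x, L)."""
    best = 0.0
    for L in dist_from:
        best = max(best,
                   dist_to[L][v] - dist_to[L][t],      # dist(v, L) - dist(t, L)
                   dist_from[L][t] - dist_from[L][v])  # dist(L, t) - dist(L, v)
    return best

def alt_shortest_path(graph, s, t, dist_from, dist_to):
    """A* search guided by the landmark potential; graph[u] = [(v, w), ...]."""
    dist = {s: 0.0}
    pq = [(alt_potential(s, t, dist_from, dist_to), s)]
    while pq:
        _, u = heapq.heappop(pq)
        if u == t:
            return dist[t]
        for v, w in graph.get(u, []):
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + alt_potential(v, t, dist_from, dist_to), v))
    return float("inf")
```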

5. Knoware of an adaptive knowledge testing system [no. 1, 2016]
Authors: Bessarabov N.A., Bondarenko A.V., Kondratenko T.N., Timofeev D.S.
Visitors: 8731
The article considers the problem of implementing knoware for an adaptive knowledge testing system. The process of testing knowledge is represented as a dynamic one. An organizer forms a homogeneous group of examinees at each cycle of the system. The system selects the best test at each step based on a convergent stochastic approximation procedure. The best test depends on the probabilistic characteristics of the examinee contingent and allows increasing the estimation accuracy of person parameters. To identify factors that prevent obtaining an objective evaluation, such as conversations and copying, the authors introduce an interaction rate. The article analyzes the influence of examinees' interaction on the accuracy of test marks. To improve the estimation accuracy of person parameters, the system re-estimates the probabilistic characteristics of the group of examinees at each cycle. Person and item parameters are aligned on a common scale at each cycle. The system evaluates item parameters using the maximum likelihood method, the conditional maximum likelihood method and the marginal maximum likelihood method. The maximum likelihood method, the weighted likelihood method and the Bayesian approach can be used for evaluating person parameters. The system supports the one-parameter dichotomous Rasch model and its two-, three- and four-parameter dichotomous extensions, the RSM and PCM polytomous models and the corresponding linearized models. To evaluate the agreement between observed data and expected values, the authors use the likelihood-ratio test statistic, the Hosmer-Lemeshow test statistic, coefficients of determination and ROC analysis. The article contains a flow chart of the algorithm for each cycle and a database schematic diagram.
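For reference, the one-parameter dichotomous Rasch model mentioned above can be sketched as follows: the probability of a correct answer depends on the difference between the person ability and the item difficulty, and the person parameter can be estimated by maximum likelihood. The Newton-Raphson scheme and the numbers are illustrative assumptions, not the system's actual implementation.

```python
# A minimal sketch of the dichotomous Rasch model and an ML estimate of the
# person parameter for known item difficulties (illustrative values).
import math

def rasch_prob(theta, b):
    """P(correct | person ability theta, item difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses, difficulties, iters=50):
    """Newton-Raphson maximum-likelihood estimate of theta from 0/1 responses."""
    theta = 0.0
    for _ in range(iters):
        p = [rasch_prob(theta, b) for b in difficulties]
        grad = sum(x - pi for x, pi in zip(responses, p))   # d logL / d theta
        hess = -sum(pi * (1 - pi) for pi in p)              # d2 logL / d theta2
        theta -= grad / hess
    return theta

print(estimate_theta([1, 1, 0, 1, 0], [-1.0, -0.5, 0.0, 0.5, 1.0]))
```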

6. Data analysis and processing to predict patients’ health status [no. 1, 2016]
Authors: Ivanov S.I., Gordienko M.G., Matasov A.V., Menshutina N.V.
Visitors: 11807
The development of information technologies has made people pay more attention to automated methods of data analysis and processing. This article describes two interconnected methods of data processing: multifactor data analysis using the principal component method and neural networks. The work showed the necessity to process data from patients’ tests and to predict a target parameter value over time based on these data. Predicting a certain target parameter makes medical treatment easier owing to the doctor’s faster decision-making regarding a way of treatment. The authors used a multifactor analysis method (the principal component method). It allowed reducing the problem dimensionality and showed the requirements to the training and test sets for designing a mathematical model based on a neural network. It also allowed identifying the key factors that can be used as inputs of a neural network model. The article contains loading diagrams and a graphical interpretation of the correlation between patients’ test values, as well as the authors’ conclusions about the correlation. Further, a neural network was trained using a training sample (about 200 experiments). Training quality was controlled using a test set (about 50 experiments). The article also contains a comparison of calculated and experimental data. The error of the neural network is 8 %. The authors developed software in C# using Visual Studio to implement the described methods.
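A minimal sketch of the dimensionality-reduction step (principal component analysis via SVD) is given below. The authors' software is written in C#, so this Python version and the synthetic patient data are purely illustrative.

```python
# A minimal PCA sketch: center the data, take the SVD, and project onto the
# leading components; the synthetic data stand in for patients' test values.
import numpy as np

def pca_reduce(X, n_components):
    """Project rows of X (patients x test values) onto principal components."""
    Xc = X - X.mean(axis=0)                    # center each test value
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (S ** 2) / np.sum(S ** 2)      # variance share per component
    return Xc @ Vt[:n_components].T, explained[:n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                 # ~200 patients, 12 test values
scores, explained = pca_reduce(X, 3)
print(scores.shape, explained)                 # reduced inputs for the network
```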

7. Equipment layout data insertion in a graph model of ship damage control analysis [no. 1, 2016]
Authors: Sorokin V.E., Bolotov A.A., Reut E.V.
Visitors: 7808
Ship damage control evaluation is based on the ability of its technical means (ship systems) to withstand combat damage and emergencies, ensure accomplishment of the combat mission and prevent the loss of the ship. The estimated functional dependency graph (EFDG) is a mathematical model based on structured-logical schemes of the dependence of ship system performance on the state of the elements of these systems. The graph is designed to estimate ship damage control under combat and operational damage to ship systems. The model formally and logically reflects the functional dependencies between functionally significant elements at various levels of ship system equipment aggregation as they perform functionally independent operations. External simulation conditions on the EFDG are set by equipment failure models caused by combat and operational damage. The article proposes including in the EFDG nodes for the status of ship sections (rooms), with arcs reflecting equipment placement according to the electronic model of the ship. At the cost of a slight increase in model dimensionality, this significantly reduces the total amount of original data and the number of considered variants. Such an approach allows performing an effective computational analysis of ship damage control within a single model, when damage control is provided by the ship systems regardless of the causes of equipment failures. The computational efficiency of the analysis thus allows extending the spatial and functional variability of modeling to determine the ship design with the best damage control assessment.
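The idea of adding compartment nodes can be sketched as a simple failure-propagation pass over a dependency graph: damaging a room fails all equipment placed in it, and a function fails when any element it depends on fails. The placement and dependency data below are invented, and the sketch uses only AND-type dependencies, which is a simplification of the EFDG logic.

```python
# A minimal failure-propagation sketch over an invented dependency structure;
# real EFDG models use richer structured-logical schemes than plain AND links.
placement = {                      # compartment -> equipment placed in it
    "room_3": ["pump_1", "valve_2"],
    "room_7": ["pump_2"],
}
depends_on = {                     # function/element -> required elements
    "fire_main": ["pump_1", "pump_2"],
    "cooling":   ["fire_main", "valve_2"],
}

def failed_elements(damaged_rooms):
    failed = {e for room in damaged_rooms for e in placement.get(room, [])}
    changed = True
    while changed:                 # propagate failures up the dependency graph
        changed = False
        for node, reqs in depends_on.items():
            if node not in failed and any(r in failed for r in reqs):
                failed.add(node)
                changed = True
    return failed

print(failed_elements({"room_7"}))   # {'pump_2', 'fire_main', 'cooling'}
```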

8. Imitation modeling for aerospace confrontation [no. 1, 2016]
Authors: Bogdanov O.A., Smirnov A.A., Kovalev D.V.
Visitors: 14337
Armed conflict processes are complex and multifaceted. When mathematical formalization that would allow an analytical solution of a task is impossible, the only approach to research such processes and systems is to use modeling methods, in particular imitation (simulation) modeling. Interactive imitation modeling complexes play an important role here. They provide an opportunity to study scenarios of large-scale application of aerospace attack weapons and aerospace defense dispositions in the form of a simulation computer experiment. At the end of the 1990s, the 2nd Research Center of the Ministry of Defence of the Russian Federation started developing the imitation modeling complex “Seliger” based on new information technologies. Nowadays a basic version of this complex is undergoing operational testing. The software is developed in C++ using the Qt library, which makes it a cross-platform software product. The imitation modeling complex “Seliger” makes it possible to simulate a bilateral conflict taking into account the object systems of both sides, their aerospace defense dispositions and attack weapons. It can also calculate system factors that characterize the efficiency of aerospace defense dispositions. The article also considers the directions of further development of the complex; the main one is building up blocks of simulation models for surveillance and air attack warning.

9. A respiratory sounds interpreter adapted to signal recording devices [no. 1, 2016]
Author: Khaneev D.M.
Visitors: 7390
The article considers the problem of constructing respiratory sound interpreter programs customized to the parameters of signal recording devices. When neural network technologies are used for respiratory sound classification, the technical characteristics of electronic stethoscopes differ; therefore, the paper shows that the scales for assessing the features of learning sample objects have to be readjusted. The paper describes the architecture and features of the software for analyzing respiratory sound records; it highlights the subsystem for adjusting to a noise recorder, which allows generating an individual set of classification rules for each stethoscope model. These rules are generated by neural-like hierarchical structures; each of them synthesizes concepts of several respiratory sound classes recorded by stethoscopes of one model. The class descriptions are created using fuzzy features, and the generation of scales for their evaluation is automated. The paper considers the results of system operation with three different types of respiratory sound recording devices (3M Littmann 4100, an original device Pat. 66174 and the KoRA-03M1 device) that have different characteristics. The analysis of the results revealed significant differences in the parameters of the classifiers of the neural-like hierarchical structures formed for different respiratory sound recording devices. However, the generated rules showed similar accuracy (88–93 %) of the respiratory sound interpreters for each electronic stethoscope model.
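The per-device readjustment idea can be sketched as recomputing feature scales from each stethoscope's own calibration recordings so that the same class descriptions stay comparable across devices. The normalization scheme, device names used as keys and the numbers below are assumptions for illustration, not the paper's neural-like structures or fuzzy scales.

```python
# A minimal sketch of per-device feature rescaling; the scheme and numbers
# are illustrative assumptions, not the authors' method.
import numpy as np

def fit_device_scale(calibration_features):
    """Per-feature location/spread estimated from one device's calibration recordings."""
    X = np.asarray(calibration_features, dtype=float)
    return X.mean(axis=0), X.std(axis=0) + 1e-9

def to_common_scale(features, scale):
    mean, std = scale
    return (np.asarray(features, dtype=float) - mean) / std

littmann_scale = fit_device_scale([[0.20, 110.0], [0.26, 130.0], [0.23, 120.0]])
kora_scale     = fit_device_scale([[0.55, 300.0], [0.61, 340.0], [0.58, 320.0]])

# The same sound recorded by two devices maps to similar values on the common
# scale, so one set of classification rules can be reused across devices.
print(to_common_scale([0.24, 125.0], littmann_scale))
print(to_common_scale([0.59, 330.0], kora_scale))
```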

10. An information system for technological machine operation analysis [no. 1, 2016]
Authors: Bolotov A.N., Gorlov I.V., Poletaeva E.V., Rakhutin M.G.
Visitors: 8406
The paper suggests a new approach to the analysis of technological machine operation using information technologies. A relevant direction for improving the efficiency of peat technological machines (TM) is the creation of a state control system that depends on specific operating conditions. The maximum seasonal performance of TM is achieved based on the analysis of models of technological objects and parametric optimization of the components responsible for performance. The automated system for analyzing the operation of processing machines was implemented in three stages. To obtain the statistical data needed to fill the model, the first stage included studying the operation parameters of technological machines for milled peat mining on the basis of JSC “Vasilevsky moss”. The authors developed an algorithm to determine the key operating parameters of a functionality restoration system. The algorithm provides the highest efficiency of the machines during the peat extraction season. The second stage included the development of a simulation model of peat machine operation. It consists of series-connected elements whose failure leads to system failure. The problem of peat machine operation analysis was solved on the example of an integrated unit that includes all the basic elements of peat milling machines. The third stage of the research included a computer experiment. The analysis of its results provides the most informed decisions on affecting the technical condition of technological machines under specific conditions with maximum efficiency. The objective component of the maintenance process is ensured by collecting and processing information about the technical condition of the object of diagnosis, based on identifying the shortcomings of units and parts that limit the time to repair.
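The series-connection assumption of the simulation model can be sketched as follows: the integrated unit works only while every element works, so its time to failure is the minimum of the elements' times to failure. The failure rates below are invented for illustration.

```python
# A minimal Monte Carlo sketch of a series-connected unit; exponential
# failure times and the rates below are illustrative assumptions.
import random

def series_time_to_failure(failure_rates, rng):
    """Time to failure of the unit: the earliest failure among its elements."""
    return min(rng.expovariate(lmbda) for lmbda in failure_rates)

def mean_time_to_failure(failure_rates, runs=10000):
    rng = random.Random(1)
    return sum(series_time_to_failure(failure_rates, rng) for _ in range(runs)) / runs

# Milling unit as three series-connected elements (failures per hour).
rates = [1 / 400.0, 1 / 250.0, 1 / 600.0]
print(mean_time_to_failure(rates))     # close to 1 / sum(rates) ≈ 122 hours
```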
