ISSN 0236-235X (P)
ISSN 2311-2735 (E)


Next issue

Publication date:
16 June 2019

Articles of journal issue No. 1, 2014.


1. Models as basic information architecture artifacts [No. 1, 2014]
Author: test_1
Visitors: 8275
Information and knowledge have long been regarded as key factors for ensuring the reliable operation and organic development of any economic system. The same role is assigned to the information model and data models in the "information architecture" domain, the most substantial component of the overall architectural description. This article discusses the structure, content and highlights of building information models, data models and other information elements of an economic system in the context of the overall architectural design process. Ultimately, the content is intended to help readers understand how the "information architecture" domain interacts with the other domains of the overall architecture of an economic system (the business architecture, organizational structure, information applications and technology infrastructure) and to appreciate its key position in this interaction.

2. Ant colony method application to implement cryptanalysis of block cryptosystems [No. 1, 2014]
Authors: Chernyshev Yu.O., Sergeev A.S., Dubrov E.O., Ryazanov A.N.
Visitors: 7163
The authors investigate applying ant colony algorithms to the cryptanalysis of block cryptosystems. The motivation is that the transition to block enciphering considerably increases the strength of cryptographic algorithms. Distinctive features of ant algorithms are noted in comparison with classical genetic algorithms. The article shows how the private key search problem can be reduced to the classical assignment problem, which is then solved with an ant colony algorithm. The paper gives a cryptanalysis algorithm, a block diagram of cryptanalysis of a type-2 DES cipher, and an example of running the cryptanalysis algorithm. In this example, an initial text block and a private key are determined from a ciphertext block using a letter-combination matrix. It is assumed that each bit of the ciphertext depends on every bit of the initial text and every bit of the key, and that the initial text and ciphertext use the same alphabet.
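The paper's own reduction of key search to the assignment problem is not reproduced in the abstract, so the sketch below is only a generic illustration of the underlying idea: a minimal ant colony optimization for an abstract assignment problem given as a cost matrix. All function and parameter names here are hypothetical, not taken from the paper.

```python
import random

def ant_colony_assignment(cost, n_ants=20, n_iters=50, rho=0.1, seed=0):
    """Minimal ant colony optimisation for the assignment problem:
    assign each of n agents to a distinct task, minimising total cost."""
    rng = random.Random(seed)
    n = len(cost)
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails
    best_perm, best_cost = None, float("inf")
    for _ in range(n_iters):
        for _ant in range(n_ants):
            free = list(range(n))
            perm = []
            for i in range(n):                   # agent i picks a free task
                weights = [tau[i][j] / (1.0 + cost[i][j]) for j in free]
                total = sum(weights)
                r, acc, pick = rng.random() * total, 0.0, free[-1]
                for j, w in zip(free, weights):  # roulette-wheel selection
                    acc += w
                    if acc >= r:
                        pick = j
                        break
                perm.append(pick)
                free.remove(pick)
            c = sum(cost[i][perm[i]] for i in range(n))
            if c < best_cost:
                best_perm, best_cost = perm, c
        for i in range(n):                       # evaporate pheromone...
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for i, j in enumerate(best_perm):        # ...then reinforce the best
            tau[i][j] += 1.0 / (1.0 + best_cost)
    return best_perm, best_cost
```

In the cryptanalytic setting described above, the cost matrix would be derived from the letter-combination statistics, and the resulting assignment would encode a key hypothesis.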

3. A model for design of ensembles of intelligent information technologies for detecting information security incidents [No. 1, 2014]
Authors: Bukhtoyarov V.V., Zhukov V.G.
Visitors: 5905
The development of effective methods for detecting information security incidents is an urgent problem. Its importance is determined by current trends in communication within information systems and by the security requirements imposed on such systems. One trend is using intelligent information technologies as the basic tools for this task; these include artificial neural networks, which have proved effective in classification, modeling and forecasting problems. Following general trends in information systems, so-called ensemble approaches have become popular for data mining problems: several neural networks process information in parallel to obtain more effective solutions. The authors propose a three-step evolutionary approach for detecting information security incidents. Results of experimental studies of the proposed approach on the KDD Cup'99 data set are presented. The paper also considers using individual neural networks when individual classifiers are distributed, the so-called ensemble-distributed approach. The authors propose a method for determining when the problem should be solved by an individual neural network and when the entire ensemble of neural networks should be used. The efficiency of the ensemble-distributed method is tested on the problem of detecting information security incidents. Directions for further study of the proposed methods are outlined.
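The abstract does not specify how the ensemble combines its members or how the individual-vs-ensemble decision is made, so the following is only a minimal sketch of the two generic ideas involved: majority voting over member predictions, and a hypothetical confidence-threshold rule for routing a sample to one classifier or to the whole ensemble. The threshold rule is an assumption for illustration, not the paper's method.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class labels from several classifiers by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_distributed(sample, members, confidence_threshold=0.9):
    """Illustrative routing rule (an assumption, not the paper's method):
    if the designated individual classifier is confident enough, use its
    answer alone; otherwise fall back to the full ensemble vote.
    `members` is a list of callables returning (label, confidence)."""
    label, conf = members[0](sample)
    if conf >= confidence_threshold:
        return label                      # individual network suffices
    return majority_vote([m(sample)[0] for m in members])
```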

4. A model of the information security events choice based on fuzzy automata [No. 1, 2014]
Authors: Borisov V.V., Goncharov M.M.
Visitors: 6007
The article addresses the current problem of choosing security measures for various information systems. It considers generalized fuzzy automata and the adaptation of their structure to the assigned tasks. A model for choosing security measures based on fuzzy automata, and a method for constructing it, are suggested. The model is a kind of fuzzy automaton that can be in several states simultaneously, which makes it possible to consider the simultaneous effect of several events, their interactions with each other, and their impact on system parameters over time. A phased method for constructing a fuzzy automaton according to the model requirements is proposed, together with a method for selecting actions based on the model. The paper also shows how the method is applied to evaluating interventions. The proposed model and method for choosing information security measures can adequately describe the interaction between events and system concepts that affect risks for a variety of information systems under uncertainty, and can analyze the simultaneous impact of measures from various groups (affecting the threats and vulnerabilities of the system) on the resulting risk level. Besides, the proposed model allows assessing the implementation of measures in stages and considering the impact on information resources over time. The developed model and method can be used in designing decision-support tools in the field of information security.
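A fuzzy automaton that is "simultaneously in several states" is commonly formalized by a fuzzy state vector updated through max-min composition with a transition membership matrix for the current event. The sketch below shows that standard composition step only; it is a textbook formulation, not the paper's specific model.

```python
def fuzzy_step(state, transition):
    """One step of a fuzzy automaton: max-min composition of the current
    fuzzy state vector with the transition membership matrix.
    state[i] = degree of being in state i; transition[i][j] = membership
    degree of the transition i -> j under the current event."""
    n = len(state)
    return [max(min(state[i], transition[i][j]) for i in range(n))
            for j in range(n)]
```

For example, with state [1.0, 0.2] and transitions [[0.3, 0.9], [0.8, 0.1]], the next fuzzy state is [0.3, 0.9]: the automaton now belongs mostly to state 1, yet retains a nonzero degree of membership in state 0.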

5. Information-simulation modeling system architecture for IT-infrastructure life cycle support [No. 1, 2014]
Author: Grishakov V.G.
Visitors: 8535
The article suggests an architecture of an information-simulation modeling system for IT-infrastructure management support. Because organizations are distributed, this modeling system can be used for management support in IT departments. It is a base platform for developing monitoring, estimation, decision-support and management systems. The modeling system supports a unified hybrid IT-infrastructure model that includes several particular models: a model of the technical components of the IT infrastructure, a model of administration tasks and goals, and a model of the IT knowledge domain. The unified hybrid model consists of informational and simulation models: data from the informational model feeds the simulation model, and simulation results in turn complement the informational model. Accordingly, the hybrid modeling system consists of informational and simulation modeling subsystems. The author suggests a virtualized hybrid modeling system as a base platform for developing IT-infrastructure modeling systems.

6. Numerical particle-in-cell code for CPU-GPU hybrid systems [No. 1, 2014]
Authors: Popova N.N., Nikishin N.G.
Visitors: 5234
The paper presents an efficiency analysis of parallel programs for studying ion evolution in a particle-in-cell model. Parallel programs were developed for hybrid computing systems with CPUs (Central Processing Units) and GPUs (Graphics Processing Units) and are applied to direct modeling of ion behavior in Fourier-transform mass spectrometer traps. The paper demonstrates how a GPU can speed up the repeated solution of Poisson's equation with Dirichlet boundary conditions, based on the Fast Fourier Transform as implemented in the cuFFT library (the FFT library for CUDA, Compute Unified Device Architecture). Effective performance and memory bandwidth are compared with peak GPU characteristics for different devices, and the chosen algorithm scales according to its complexity estimate. Programs were also developed for calculating trapping fields between arbitrarily shaped electrodes on hybrid systems with simultaneous CPU and GPU data processing: every parallel process of the field-calculation program solves a system of linear equations on the GPU with LAPACK procedures as implemented in the CULA library. Calculations on the Lomonosov supercomputer show that the efficiency of parallel GPU usage depends significantly on how parallel program processes are distributed over nodes. Accelerators can be used efficiently for computing the Coulomb interaction between ions by solving the boundary problem for Poisson's equation, and the fields induced by each surface can be computed in parallel on the CPU while systems of linear equations are solved on the GPU, achieving good efficiency.
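The FFT-based Dirichlet Poisson solve mentioned above can be sketched on the CPU with NumPy: a type-I discrete sine transform (built from the FFT by odd extension, the same trick a cuFFT-based implementation would use) diagonalizes the 5-point Laplacian. This is a minimal serial sketch of the algorithm class, not the paper's GPU code; all names are illustrative.

```python
import numpy as np

def dst1(a, axis=-1):
    """Type-I discrete sine transform along `axis`, built from the FFT of
    an odd extension of the data (zero endpoints enforce Dirichlet BCs)."""
    a = np.moveaxis(a, axis, -1)
    n = a.shape[-1]
    ext = np.zeros(a.shape[:-1] + (2 * n + 2,))
    ext[..., 1:n + 1] = a                 # [0, a, 0, -reversed(a)]
    ext[..., n + 2:] = -a[..., ::-1]
    out = -np.fft.fft(ext).imag[..., 1:n + 1] / 2.0
    return np.moveaxis(out, -1, axis)

def poisson_dirichlet(f, h=1.0):
    """Solve -laplace(u) = f on a uniform grid with zero Dirichlet
    boundaries; f holds interior values only. The sine transform
    diagonalises the 5-point Laplacian, so the solve is elementwise."""
    ny, nx = f.shape
    fh = dst1(dst1(f, axis=0), axis=1)
    ky = 2.0 - 2.0 * np.cos(np.pi * np.arange(1, ny + 1) / (ny + 1))
    kx = 2.0 - 2.0 * np.cos(np.pi * np.arange(1, nx + 1) / (nx + 1))
    uh = h * h * fh / (ky[:, None] + kx[None, :])
    # DST-I is its own inverse up to a scale factor per axis
    return dst1(dst1(uh, axis=0), axis=1) * (2.0 / (ny + 1)) * (2.0 / (nx + 1))
```

On a GPU the two `dst1` passes would map onto batched cuFFT calls, which is where the speed-up reported above comes from.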

7. Modeling agents interaction in a multi-agent system using the apparatus of coloured petri nets and fuzzy logic [No. 1, 2014]
Author: Vladimirov A.V.
Visitors: 6750
The article identifies problems in the functioning of major adaptive information systems, as well as the basic requirements for them, which allows forming the structure of an adaptive information system and highlighting its main segments. The paper also describes the functions of the system components and their operation. A structure of the agent-based component of the adaptive information system is obtained with consideration of the agents' functions. Using the apparatus of colored Petri nets, the author models the process of agent interaction in a multi-agent adaptive information system: a basic set of colors is shown, the main model components are considered, the model's behavior at different stages is studied, and the main system properties are analyzed. The article describes how system agents process tasks, the basic states of agents during task processing, and the queueing mechanisms involved in system functioning. Each agent is described as a tuple. A model of construction quality evaluation is then considered. A prototype fuzzy inference system is built from the obtained common set of production rules, optimized to enhance the stability and adequacy of the model, along with certain fuzzy linguistic variables; the term sets of the linguistic variables are considered. A construction quality estimate is computed from the input data, the effect of varying the input fuzzy linguistic variables on the output variable is analyzed, and a three-dimensional visualization of the quality assessment model is obtained and analyzed.

8. Using data compression and double caching to increase client-server applications operating efficiency [No. 1, 2014]
Authors: Evseenko I.А., Melnikov I.I., Demidenkov К.А.
Visitors: 5841
Many organizations use client-server applications to give their employees and clients the opportunity to work with the necessary information on their local computers. Network delays and server-side data processing must not aggravate this work. The problem of accelerating client-server applications through caching and increased data transfer rates did not exist until client-server applications became popular in wide area networks; granting fast access to a central server has since become important because of enterprise expansion and the creation of new service centers and remote desks in different regions of the world. To solve the problem, the authors propose double caching (on the client side and on the server side) with periodic client/server cache updates, plus additional measures for accelerating data transfer (compression and encryption). A WAN accelerator has been developed based on this method. It is an independent program module, transparent to the client-server application that uses it, and contains two parts: the client part (accelerator-client) and the server part (accelerator-server), with a TCP connection between them. The accelerator-client intercepts client HTTP requests, caches, compresses and encrypts them, and sends them to the accelerator-server. The accelerator-server receives the requests, decrypts, decompresses and caches them, and sends the restored HTTP requests to the server. The server processes the requests and generates the HTTP responses. The accelerator-server intercepts the responses, caches, compresses and encrypts them, and sends them to the accelerator-client, which decrypts, decompresses and caches them and sends the restored HTTP responses to the client.
If an identical HTTP request is intercepted by the accelerator-client, the HTTP response is extracted either from the accelerator-client cache (the best case, because there is no need to send the request anywhere) or from the accelerator-server cache (the worse case, because the request must travel to the accelerator-server, although there is still no need to forward it to the server and wait for a response to be generated). Test series were run to evaluate the operating efficiency of the accelerator, using client and server mocks: the client mock sent test HTTP requests, the server mock received them and generated test HTTP responses, and the accelerator intercepted both. The test requests and responses had the same structure as those used by real client-server applications in the target organizations. The test series demonstrated that the accelerator could decrease response time by 14–98 % depending on network bandwidth and the repetition rate of identical requests/responses, making the client-server application visibly faster for the user.
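The two-level lookup order described above (client cache, then server cache, then the origin server) can be sketched as follows. This is a toy model of the cache logic only; the compression, encryption and TCP transport of the real accelerator are omitted, and all names are illustrative.

```python
class DoubleCachingAccelerator:
    """Toy model of the double-caching lookup described above. `origin`
    stands in for the real HTTP server; plain strings stand in for full
    HTTP requests and responses."""

    def __init__(self, origin):
        self.client_cache = {}   # held by the accelerator-client
        self.server_cache = {}   # held by the accelerator-server
        self.origin = origin

    def get(self, request):
        # best case: answered locally, no network round trip at all
        if request in self.client_cache:
            return self.client_cache[request], "client-cache"
        # next best: one short round trip to the accelerator-server cache,
        # but no forwarding to the origin server
        if request in self.server_cache:
            response = self.server_cache[request]
            self.client_cache[request] = response
            return response, "server-cache"
        # miss: forward to the origin server and populate both caches
        response = self.origin(request)
        self.server_cache[request] = response
        self.client_cache[request] = response
        return response, "origin"
```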

9. Analysis of data collection using logic-set histogram representation [No. 1, 2014]
Author: Papulin S.Yu.
Visitors: 5518
The article analyzes a data collection using a logic-set histogram representation. This representation is based on a histogram and a special mathematical system that allows implementing user queries as statements with logic and set operations. The result of the analysis is a quantitative value of the presence, in a data item, of the elements designated in a query; it is also possible to use a sample query to find quantitative values of similarity to the analyzed data. A collection is defined as a list of same-type data items (for example, textual documents, images or video) made up of elements of a universal set; each item in a collection is matched with its histogram representation. The paper considers two approaches to analyzing a data collection using the logic-set histogram representation, both built on element-by-element analysis. As a result of analyzing a data collection, the author obtains a list of quantitative presence values for element queries and similarity values for sample queries, which can be arranged in an order convenient for the user.
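The paper's formal logic-set system is not given in the abstract, so the sketch below only illustrates the two kinds of query output it describes, under simple stand-in definitions: "presence" as the histogram mass of the queried elements, and "similarity" as histogram intersection. Both choices are assumptions made for illustration.

```python
def presence(histogram, query_elements):
    """Degree to which the elements named in a query are present in a data
    item, given its normalised histogram (element -> share of the item)."""
    return sum(histogram.get(e, 0.0) for e in query_elements)

def similarity(h1, h2):
    """Histogram intersection: a standard similarity measure between two
    normalised histograms, standing in for sample-query matching."""
    return sum(min(h1.get(e, 0.0), h2.get(e, 0.0)) for e in set(h1) | set(h2))

def rank_collection(collection, query_elements):
    """Order a collection (name -> histogram) by decreasing presence,
    i.e. the convenient ordering of results mentioned above."""
    return sorted(collection,
                  key=lambda name: presence(collection[name], query_elements),
                  reverse=True)
```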

10. Intelligent procedures of technological processes design in integrated CAD-CAM systems [No. 1, 2014]
Authors: Semenov N.A., Burdo G.B., Isaev A.A.
Visitors: 6288
The efficiency of synthesis procedures in computer-aided design (CAD) systems for machining operation processes is determined by the system's hierarchical structure, the validity of the choice of input and output information flows in its elements (subsystems), the rules for converting input data into output data, and the possibility of responding dynamically to the production situation in machining workshops. For this purpose, the authors have studied an intelligent CAD system for multi-product machinery production with computer-integrated manufacturing control (in view of the CAM approach). A set-theoretic approach is applied: a set of elements is assigned to the CAD system, each element complying with its function and links, some of which depend on a time parameter. The composition and hierarchy of the CAD subsystems are given. Specific features of the CAD system in question include subsystems implementing the learning-curve and best-practice concepts in machining procedures, functions for selecting solutions at each level of the solution synthesis decomposition, and data and temporal connections to the CAM system. The article considers the decomposition levels of technological process design and the associated CAD system functions within every level. The analyzed approach and data conversion rules were used to develop software tools for the computer-aided design of technological processes in one-off and limited production environments; they have been tested at one of the enterprises of Tver, Russia.
