A Simulink model is considered that calculates the transient processes of objects described by their step response for any type of input action. An algorithm is described for an S-function that performs the calculation using the Duhamel integral. It is shown that, owing to the features of the S-function, it can store values from the previous step of the Simulink calculation. This allows the input signal to be decomposed into step components, storing the time of occurrence and the value of each step. For each increment of the input signal, the S-function calculates the response by scaling the step response; at each calculation step, the sum of these reactions is then found. The S-function provides a procedure for freeing memory once the end point of the step response is reached for a given step. Thus, the amount of memory required does not grow above a certain limit and, in general, does not depend on the length of the simulation. The S-function relies on matrix operations rather than loops, so the model computes quickly. The article presents calculation results, gives recommendations for setting the model parameters, and formulates a conclusion on the applicability of the model to calculating dynamic modes.
Keywords: simulation modeling, Simulink, step response, step function, S-function, Duhamel integral.
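For readers who want to experiment with the superposition idea described above, a minimal numpy sketch is given below. The article's S-function is vectorized and runs inside Simulink; this loop-based function, with its hypothetical `duhamel_response` name, only illustrates the step-decomposition-and-scaling scheme:

```python
import numpy as np

def duhamel_response(t, u, h_step):
    """Superpose scaled copies of a step response h_step (sampled on t)
    for each increment of the input u -- a sketch of the idea described
    in the abstract, not the article's actual S-function code."""
    n = len(t)
    y = np.zeros(n)
    du = np.diff(u, prepend=u[0])        # step decomposition of the input
    du[0] = u[0]                         # initial step carries u(0)
    for k in np.nonzero(du)[0]:          # only where the input actually steps
        m = n - k
        y[k:] += du[k] * h_step[:m]      # shifted, scaled step response
    return y

# toy usage: first-order step response and a staircase input
t = np.linspace(0.0, 10.0, 1001)
h = 1.0 - np.exp(-t)                     # unit step response of 1/(s+1)
u = np.where(t < 3.0, 1.0, 2.0)          # input stepping from 1 to 2 at t=3
y = duhamel_response(t, u, h)
```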
The article describes the mathematical foundations of time-frequency analysis of signals using the Empirical Mode Decomposition (EMD), Intrinsic Time-Scale Decomposition (ITD), and Variational Mode Decomposition (VMD) algorithms. Synthetic and real signals distorted by additive white Gaussian noise with different signal-to-noise ratios are considered. A comprehensive comparison of the EMD, ITD, and VMD algorithms is performed, the possibility of using these algorithms for signal denoising and spectral analysis is investigated, and the execution time and computational stability of each algorithm are estimated.
Keywords: time-frequency analysis, denoising, decomposition, mode, Hilbert-Huang transformation, Empirical Mode Decomposition, Intrinsic Time-Scale Decomposition, Variational Mode Decomposition
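As an illustration of the mode-decomposition idea behind EMD, the following sketch performs a single sifting pass using scipy spline envelopes. It is not the implementation benchmarked in the article; production EMD additionally needs boundary handling, IMF stopping criteria, and iteration over residues:

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(x):
    """One EMD sifting pass: subtract the mean of the upper and lower
    cubic-spline envelopes of the signal's extrema."""
    t = np.arange(len(x))
    imax = argrelextrema(x, np.greater)[0]
    imin = argrelextrema(x, np.less)[0]
    if len(imax) < 2 or len(imin) < 2:
        return x                         # too few extrema to build envelopes
    upper = CubicSpline(imax, x[imax])(t)
    lower = CubicSpline(imin, x[imin])(t)
    return x - 0.5 * (upper + lower)

# toy usage: two tones; a few sifting iterations approach the fast IMF
t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
candidate_imf = x.copy()
for _ in range(10):
    candidate_imf = sift_once(candidate_imf)
```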
Modern digitalization processes involve the use of intelligent systems at key stages of information processing. Given that the data available for intelligent analysis in organizational systems are often fuzzy, the problem arises of comparing the corresponding units of information with each other. Several methods for such a comparison are known. In particular, for random fuzzy variables with known distribution laws, the degree of coincidence of these laws can serve as a criterion of correspondence between one variable and another. However, this approach lacks the flexibility required for practical problems. The approach we propose allows fuzzy data to be compared with fuzzy data, fuzzy with crisp, and crisp with crisp. The paper provides an example illustrating this approach. Although the study initially focused on managing organizational systems in education, its results can be extended to other organizational systems.
Keywords: fuzzy data, weakly structured problems, comparison criteria, hierarchy analysis method, systems analysis, fuzzy benchmarking
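The abstract does not spell out the proposed comparison criterion, so the sketch below shows one common way to compare fuzzy with fuzzy data (and, as a degenerate case, with crisp values): the possibility of equality of two triangular fuzzy numbers. Function names and the discretization grid are illustrative assumptions:

```python
def tri_mu(x, a, b, c):
    """Membership of x in triangular fuzzy number (a, b, c);
    degenerate triangles (v, v, v) model crisp values."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    return (c - x) / (c - b) if c > b else 1.0

def possibility(A, B, grid):
    """Poss(A = B) = sup_x min(mu_A(x), mu_B(x)) on a discretized grid.
    For crisp operands the grid must contain the crisp value exactly."""
    return max(min(tri_mu(x, *A), tri_mu(x, *B)) for x in grid)

# usage: compare a fuzzy estimate with a crisp benchmark value 3.0
grid = [i * 0.01 for i in range(0, 601)]
print(possibility((2.0, 3.0, 4.0), (3.0, 3.0, 3.0), grid))
```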
The article presents the main stages of, and recommendations for, developing an information and analytical system (IAS) based on geographic information systems (GIS) for the rational management of forest resources, covering the processing, storage, and presentation of information on forest wood resources, together with specific examples of the implementation of individual components and digital technologies. The following stages of IAS development are considered: collecting and structuring data on forest wood resources; justifying the type of software implementation; selecting equipment; developing the data analysis and processing unit; designing the architecture of interaction between IAS blocks; developing the application interface; and testing the IAS. It is proposed to implement the interaction between the client and server parts using Asynchronous JavaScript and XML (AJAX) technology, to use the open-source Leaflet library for geodata visualization, and to use the SQLite database management system for storing large amounts of data on the server. The proposed approaches can find application in creating IAS for management decision-making in the rational management of forest wood resources.
Keywords: geographic information systems, forest resources, methodology, web application, AJAX technology, SQLite, Leaflet, information processing
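A minimal sketch of the proposed client-server pattern, assuming a hypothetical `stands` table in SQLite and using Flask (the abstract does not name the server framework): the endpoint returns GeoJSON that a Leaflet layer can fetch via AJAX:

```python
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)
DB = "forest.db"   # hypothetical database file and schema

@app.route("/api/stands")
def stands():
    """Return forest-stand records as GeoJSON for an AJAX request
    from a Leaflet front end; table and column names are illustrative."""
    con = sqlite3.connect(DB)
    rows = con.execute(
        "SELECT id, species, stock_m3, lon, lat FROM stands").fetchall()
    con.close()
    features = [{
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"id": i, "species": sp, "stock_m3": v},
    } for i, sp, v, lon, lat in rows]
    return jsonify({"type": "FeatureCollection", "features": features})

if __name__ == "__main__":
    app.run(debug=True)
```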
With the development of low-orbit satellite Internet systems, ensuring effective operation under intentional interference comes to the fore. One solution is to use systems that combine OFDM methods with generators implementing frequency hopping (FH). Obviously, the more complex the algorithm for selecting operating frequencies, the more effective the FH system. The article proposes using the SPN cipher "Grasshopper" (Kuznyechik) as the generator that selects operating frequencies. As a result, the FH system acquires high resistance to attempts by electronic warfare systems to calculate the operating frequency numbers. However, faults and failures may occur during operation. To mitigate their consequences, it is proposed to implement the SPN cipher using polynomial modular codes of residue classes (PMCC). One of the transformations in "Grasshopper" is a nonlinear transformation that performs the substitution operation. Creating a new mathematical model for performing this nonlinear transformation using PMCC will keep the SPN-cipher-based FH generator operating in the presence of faults and failures.
Keywords: low-orbit satellite Internet systems, the Grasshopper SPN cipher, nonlinear transformations, modular codes of residue classes, mathematical model, fault tolerance, frequency hopping, polynomial modular code of residue classes
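To make the role of the substitution layer concrete, here is an illustrative sketch of an SPN nonlinear transformation driving frequency selection. The S-box below is randomly generated, not the standardized Kuznyechik pi-table (GOST R 34.12-2015), and the hop-derivation rule is an assumption for demonstration only:

```python
import random

# Illustrative only: a random bijective 8-bit S-box stands in for the
# cipher's real substitution table.
rng = random.Random(42)
SBOX = list(range(256))
rng.shuffle(SBOX)

def sub_layer(block: bytes) -> bytes:
    """Nonlinear substitution layer of an SPN round: apply the S-box
    byte-wise to the 16-byte state."""
    return bytes(SBOX[b] for b in block)

def next_hop(state: bytes, n_channels: int = 64) -> int:
    """Derive the next operating-frequency index from the substituted
    state -- a sketch of cipher-driven frequency hopping."""
    state = sub_layer(state)
    return int.from_bytes(state[:2], "big") % n_channels

print(next_hop(bytes(range(16))))
```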
With the digitalisation of the construction industry and import substitution, increasing attention is being paid to the transition to domestic software. Each stage of construction requires its own software products, including CAD and BIM. The paper considers the experience of integrating Russian-made systems for information modeling of transport infrastructure and road construction. Within this work, the Vitro-CAD CDE and the Topomatic Robur software system were integrated, and the joint work of construction project participants was organized in a single information space. Project participants' efficiency increased because routine operations were eliminated. The integration experience has shown that the combination of Vitro-CAD and Topomatic Robur makes it possible to manage project data efficiently, store files with version tracking, coordinate documentation, and issue comments on it.
Keywords: common data environment, information space, information model, digital ecosystem, computer-aided design, building information modeling, automation, integration, import substitution, software complex, platform, design documentation, road construction
In the article, based on an estimate of the Euclidean norm of the deviation between the transient and stationary states of a dynamic system, the contraction condition of the generalized projection operator of a dynamic system with constraints is derived. From the principle of contraction mappings, taking this condition into account, estimates are obtained for a sufficient condition for the stability of the dynamic system stabilizing an equilibrium position or program motions. The obtained estimates generalize previously published results. The stability of the operator of a constrained dynamic system is demonstrated experimentally.
Keywords: sufficient condition for stability, projection operator, stabilization of equilibrium position, stabilization of program motions, SimInTech
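For reference, the contraction condition invoked here has the standard form (our notation, not necessarily the authors'):

```latex
% contraction condition on the generalized projection operator P
\| P(x) - P(y) \| \le q \, \| x - y \|, \qquad 0 \le q < 1,
% from which, by the Banach contraction-mapping principle, the iteration
% x_{k+1} = P(x_k) converges to a unique fixed point x^* with
\| x_k - x^* \| \le \frac{q^k}{1 - q} \, \| x_1 - x_0 \|.
```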
Oil spills require timely measures to eliminate their causes and neutralize their consequences. Case-based reasoning is a promising way to develop specific technological solutions for eliminating oil spills, which makes it important to structure the description of possible situations and the representation of solutions. This paper presents the results of these tasks: a structure for representing oil-spill situations based on a situation tree, a description of the algorithm for situational decision-making using this structure, and parameters for describing oil-spill situations and presenting solutions. The situation tree makes it possible to form representations of situations from the analysis of diverse source information. This approach makes it possible to quickly refine the parameters and select similar situations from the knowledge base, whose solutions can be applied to the current undesirable situation.
Keywords: case-based reasoning, decision making, oil spill, oil spill response, decision support, situation tree
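A minimal sketch of what a situation tree and similarity-based retrieval could look like; the node structure, the similarity formula, and its weights are illustrative assumptions, not the parameters proposed in the paper:

```python
from dataclasses import dataclass, field

@dataclass
class SituationNode:
    """Node of a situation tree: a named parameter with a value and
    child parameters refining the description."""
    name: str
    value: float = 0.0
    children: list = field(default_factory=list)

def similarity(a: SituationNode, b: SituationNode) -> float:
    """Recursive similarity of two situation descriptions: local match
    on the node value plus the averaged match of same-named children."""
    local = 1.0 / (1.0 + abs(a.value - b.value))
    shared = [(c, d) for c in a.children for d in b.children if c.name == d.name]
    if not shared:
        return local
    return 0.5 * local + 0.5 * sum(similarity(c, d) for c, d in shared) / len(shared)

def retrieve(current, case_base):
    """Pick the most similar past case from the knowledge base."""
    return max(case_base, key=lambda case: similarity(current, case))

# usage sketch with two stored cases
base = [SituationNode("spill", 10.0, [SituationNode("viscosity", 0.8)]),
        SituationNode("spill", 50.0, [SituationNode("viscosity", 0.3)])]
now = SituationNode("spill", 12.0, [SituationNode("viscosity", 0.7)])
best = retrieve(now, base)
```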
The purpose of the article is the software implementation of a module for analyzing site-user activity based on a click heat map that is compatible with domestic web services and combines correlation and regression analysis with dashboard visualization before and after changes are made to site elements. All functionality runs directly in the web analytics service. Based on the data obtained for the analyzed site element, a decision is made to adjust the design and/or content to increase the click-through rate. The proposed solution thus extends the functionality of the web analytics service and reduces labor costs. The software module has been successfully tested: after the analysis and the necessary adjustments to the site, the click-through rate increased.
Keywords: user activity, correlation and regression analysis, dashboard, program module, trend line, coefficient of determination
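The kind of trend-line and coefficient-of-determination computation such a module performs can be sketched with numpy on synthetic click-through rates (numbers invented for illustration):

```python
import numpy as np

# daily click-through rates for a page element before/after a redesign
days = np.arange(14)
ctr = np.array([2.1, 2.0, 2.2, 2.1, 2.3, 2.2, 2.1,    # before change
                2.6, 2.8, 2.7, 2.9, 3.0, 2.9, 3.1])   # after change

slope, intercept = np.polyfit(days, ctr, 1)            # linear trend line
trend = slope * days + intercept
ss_res = np.sum((ctr - trend) ** 2)
ss_tot = np.sum((ctr - ctr.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                             # coefficient of determination
print(f"trend slope = {slope:.3f} %/day, R^2 = {r2:.3f}")
```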
The article reviews and systematises work devoted to applying machine learning to problems of research, calculation, and design of reinforced concrete structures. It considers currently relevant aspects of calculation and design, as well as the assessment of the technical condition of structures, using approaches that implement machine learning schemes, including deep learning and ensemble algorithms. It is shown that this area is developing and improving rapidly in construction science worldwide: machine learning algorithms are used to predict design parameters, to identify particular parameters, defects, and damage with classification algorithms, and for other tasks. The materials presented will allow specialists to choose their subject area of research more precisely and to determine directions for adapting and improving their own developments in machine learning.
Keywords: machine learning, reinforced concrete structures, regression equations, identification, approximation, artificial intelligence
When assessing the risk of emergency situations, there is often a need to analyze unstructured data. Traditional analysis methods may not account for the ambiguity of information, which makes them insufficiently effective for risk assessment. The article proposes a modified analytic hierarchy process that uses fuzzy logic, allowing uncertainties and subjective assessments to be taken into account more effectively when analyzing emergency risks. In addition, such methods can incorporate qualitative as well as quantitative indicators. This, in turn, can lead to better-informed risk management decisions and increased preparedness for various situations. Integrating technologies for working with unstructured data into emergency risk assessment not only increases forecasting accuracy but also allows management strategies to adapt to changing conditions.
Keywords: artificial intelligence systems, unstructured data, risk assessment, classical analytic hierarchy process, modified analytic hierarchy process, fuzzy logical inference system
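One widely used variant of fuzzy AHP (Buckley's geometric-mean method) can be sketched as follows; the pairwise judgments are invented, and the article's modification may differ in how fuzzy weights are derived and defuzzified:

```python
import numpy as np

# Triangular fuzzy pairwise-comparison matrix (l, m, u) for 3 criteria
# (illustrative judgments, not from the article).
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

g = np.prod(M, axis=1) ** (1 / M.shape[0])   # fuzzy geometric mean per row
total = g.sum(axis=0)
w_fuzzy = g / total[::-1]                    # fuzzy division: (l,m,u) / (u,m,l)
w_crisp = w_fuzzy.mean(axis=1)               # centroid defuzzification
w_crisp /= w_crisp.sum()
print("criterion weights:", np.round(w_crisp, 3))
```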
Many modern information processing and control systems in various fields are based on software and hardware for image processing and analysis. At the same time, it is often necessary to store and transmit large data sets, including image collections, and data compression technologies are used to reduce the memory required and increase transmission speed. Approaches based on discrete wavelet transforms have been developed and applied for this purpose; their advantage is the ability to localize points of brightness change in images. The detail coefficients corresponding to such points make a significant contribution to the energy of the image. This contribution can be quantified as weights, whose analysis determines how the wavelet coefficients are quantized in the proposed lossy compression method. The approach described in the paper follows the general image compression scheme, with stages of transformation, quantization, and encoding. It provides good compression performance and can be used in information processing and control systems.
Keywords: image processing, image compression, redundancy in images, general image compression scheme, wavelet transform, compression based on wavelet transform, weight model, significance of detail coefficients, quantization, entropy coding
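The transform-quantize-encode pipeline can be sketched with PyWavelets; the quantile threshold below is a crude stand-in for the paper's weight-based quantization of detail coefficients:

```python
import numpy as np
import pywt  # PyWavelets

def compress(image, wavelet="db2", level=2, keep=0.05):
    """Sketch of transform-quantize-encode: 2-D DWT, keep the largest
    `keep` fraction of coefficients by magnitude, zero the rest, and
    reconstruct. Entropy coding of the sparse array is omitted."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr = np.where(np.abs(arr) >= thresh, arr, 0.0)   # crude quantization
    coeffs_q = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs_q, wavelet)

# usage: reconstructed = compress(gray_image_as_2d_float_array)
```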
The paper considers the task of collecting and preparing data coming from several information systems, using the automation of a registrar's reporting as an example. The OWL, XML, and XBRL languages and semantic networks can be used to describe the subject area. A set of criteria is prepared for analysing and selecting the knowledge representation language best suited to data collection, using financial statements as an example. The results of the service's development are described, and the application of the XBRL format is shown. A multi-agent approach to modelling and designing information systems was used in developing the service.
Keywords: data mining, subject area model, data formats, XBRL, business process, service, data integration
The work presents a review of modern log trucks under the recently imposed sanctions. The author argues that renewing existing log truck fleets has become an urgent problem for forest transport and logging companies. The market offers a wide range of new basic chassis and trucks with 6x4 and 6x6 wheel arrangements for building log trucks, produced by Russian, Belarusian, and Chinese factories, and a large number of trailer links are produced as well; there is also the option of buying used trucks from other companies. As a first stage of technical and economic analysis and preliminary selection of the optimal type and composition of a log truck, a comparative assessment of log truck effectiveness was carried out. The analysis shows that Russian log trucks with engine power above 400 HP (horsepower) can compete with the best foreign models. Nevertheless, the reliability of Russian, Belarusian, and Chinese log trucks needs further research.
Keywords: log trucks, trailer links, productivity, effectiveness
This paper describes approaches to visualizing and comparing semantic trees that reflect the component structure of a patented device and the connections between components, using graph databases. A graph DBMS uses graph structures to store, process, and represent data. The main elements of a graph database are nodes and edges, which, within this task, model entities of 3 types (SYSTEM, COMPONENT, ATTRIBUTE) and 5 types of connections (PART-OF, LOCATED-AT, CONNECTED-WITH, ATTRIBUTE-FOR, IN-MANNER-OF). The study shows that Neo4j offers the best graph visualization capabilities; ArangoDB, despite correctly entered queries, performs incomplete visualization; and AllegroGraph proved difficult to work with in code and to configure for graph-tree visualization. Three algorithms for comparing graph representations of information were tested: Graph Edit Distance, Topological Comparison, and Subgraph Isomorphism. The algorithms, implemented in Python, compare two graph trees and display a visualization and analysis of common graph structures and differences.
Keywords: semantic tree, component structure, patent, graph databases, Neo4j, AllegroGraph, ArangoDB
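Two of the tested comparison algorithms are available directly in networkx, as the sketch below shows on toy component trees in the paper's node/edge schema (toy data, illustrative attributes):

```python
import networkx as nx

# Two toy component trees: edges typed PART-OF as in the paper's schema.
g1 = nx.DiGraph()
g1.add_edge("device", "housing", type="PART-OF")
g1.add_edge("device", "sensor", type="PART-OF")

g2 = nx.DiGraph()
g2.add_edge("device", "housing", type="PART-OF")
g2.add_edge("device", "actuator", type="PART-OF")

# 1) Graph edit distance (exact; exponential, fine for small trees)
ged = nx.graph_edit_distance(g1, g2)

# 2) Subgraph isomorphism: is some subgraph of g1 isomorphic to g2?
gm = nx.algorithms.isomorphism.DiGraphMatcher(g1, g2)
has_iso_subgraph = gm.subgraph_is_isomorphic()

print(ged, has_iso_subgraph)
```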
In systems for monitoring, diagnostics, and state recognition of various types of objects, an important aspect is reducing the volume of measured signal data for transmission or accumulation in information bases, with the ability to restore it without significant distortion. A special type of signal in this context is the packet signal, which consists of sets of harmonics with multiple frequencies and is truly periodic with a clearly distinguishable period. Signals of this type are typical of mechanical and electromechanical systems with rotating elements: reducers, gearboxes, electric motors, internal combustion engines, etc. The article considers a number of models for reducing such signals and the cases in which each is preferable. In particular, the following are highlighted: the discrete Fourier transform model with a modified formula for restoring a continuous signal, the proposed model based on decomposition by bordering functions, and the discrete cosine transform model. The first two models provide absolute accuracy of signal restoration after reduction; the last is a lossy reduction model. The main criteria for evaluating the models are the computational complexity of the implemented transformations, the degree of signal reduction achieved, and the error in restoring the signal from the reduced data. It was found that each of the listed models can be applied to packet signals, the choice being determined by the priority indicators of the reduction assessment. The considered reduction models can be applied in information-measuring systems for condition monitoring, diagnostics, and control of the above-mentioned objects.
Keywords: reduction model, measured packet signal, discrete cosine transform, decomposition into bordering functions, reduction quality assessment, information-measuring system
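The lossy DCT-based reduction model can be sketched as follows: keep the largest-magnitude coefficients of a packet-like signal and measure the restoration error (parameters are illustrative; the article's exact quantization rule may differ):

```python
import numpy as np
from scipy.fft import dct, idct

def reduce_dct(x, keep=32):
    """Lossy reduction via the discrete cosine transform: keep the
    `keep` largest-magnitude coefficients, zero the rest."""
    c = dct(x, norm="ortho")
    c[np.argsort(np.abs(c))[:-keep]] = 0.0   # drop all but `keep` coefficients
    return idct(c, norm="ortho")

# packet-like test signal: harmonics with multiple frequencies
t = np.linspace(0, 1, 1024, endpoint=False)
x = (np.sin(2*np.pi*50*t) + 0.4*np.sin(2*np.pi*100*t)
     + 0.2*np.sin(2*np.pi*150*t))
x_rec = reduce_dct(x)
err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)   # restoration error
print(f"relative error = {err:.4f}")
```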
At present, continuous stirred-tank reactors are widely used in many industries, and there are many control methods for this type of reactor. This paper presents a design method for a model predictive controller (MPC) based on a fuzzy model: the controlled object is modeled by a Takagi-Sugeno fuzzy model, and the optimization problem is solved by a genetic algorithm. The MPC controller implemented with fuzzy models and genetic algorithms achieved better quality than traditional MPC controllers.
Keywords: method of designing a model predictive controller, fuzzy model, Takagi-Sugeno, genetic algorithms, multiple inputs-multiple outputs
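A compact sketch of the scheme: a Takagi-Sugeno plant blends two local linear models, and a crude genetic algorithm searches over control sequences in receding-horizon fashion. All coefficients are toy values, not a real CSTR model:

```python
import numpy as np
rng = np.random.default_rng(0)

def ts_step(x, u):
    """Takagi-Sugeno plant sketch: two local linear models x+ = a*x + b*u
    blended by a membership function on x (toy coefficients)."""
    w = 1.0 / (1.0 + np.exp(-2.0 * x))        # membership of "high" regime
    a = (1 - w) * 0.9 + w * 0.7
    b = (1 - w) * 0.10 + w * 0.15
    return a * x + b * u

def cost(x0, u_seq, ref=1.0):
    """MPC cost over the horizon: tracking error plus control effort."""
    x, J = x0, 0.0
    for u in u_seq:
        x = ts_step(x, u)
        J += (x - ref) ** 2 + 0.01 * u ** 2
    return J

# crude GA over length-H control sequences: selection + Gaussian mutation
H, pop_n = 10, 40
pop = rng.uniform(-5, 5, size=(pop_n, H))
for _ in range(60):
    J = np.array([cost(0.0, ind) for ind in pop])
    elite = pop[np.argsort(J)[: pop_n // 4]]           # keep best quarter
    pop = np.repeat(elite, 4, axis=0) + rng.normal(0, 0.3, (pop_n, H))
u_mpc = pop[np.argmin([cost(0.0, ind) for ind in pop])][0]  # apply first move
```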
In the operational diagnostics and state recognition of complex technical systems, an important task is to identify small, time-determined changes in the complex measured diagnostic signals of the controlled object. For this purpose, the signal is transformed into a small-sized image in a diagnostic feature space, where it moves along trajectories of different shapes depending on the nature and magnitude of the changes. It is important to identify stable, deterministic patterns of change in these complex-shaped diagnostic signals, and identifying such patterns depends largely on how the small-sized feature space is constructed. The article considers, as such a space, the space of decomposition coefficients of the measured signal in an adaptive orthonormal basis of canonical transformations, where the basis is constructed from a representative sample of realizations of the controlled signal for various states of the system using the proposed algorithm. The identified shapes of the image trajectories correspond to specific types of deterministic changes in the signal, and analytical functional dependencies were discovered linking a specific type of signal change with the shape of the image trajectory in the feature space. The proposed approach simplifies modeling, operational diagnostics, and condition monitoring in, for example, low-frequency diagnostics and defectoscopy of structures, vibration diagnostics, and monitoring of an object's stress state by analyzing the time characteristics of response functions to impact.
Keywords: modeling, functional dependencies, state recognition, diagnostic image, image movement trajectories, small changes in diagnostic signals, canonical decomposition basis, analytical description of image trajectory
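The construction of an adaptive orthonormal basis from sample realizations can be sketched with an SVD (a Karhunen-Loeve-style stand-in for the article's canonical-transformation basis, whose exact algorithm the abstract does not give):

```python
import numpy as np

def build_basis(X, dim=3):
    """Build an orthonormal basis from a representative sample of signal
    realizations (rows of X) via SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:dim]                      # first `dim` orthonormal basis vectors

def to_image(signal, basis, mean):
    """Project a measured signal onto the low-dimensional feature space;
    a sequence of such points traces the diagnostic image's trajectory."""
    return basis @ (signal - mean)

# usage: X holds N realizations of length L for various system states
N, L = 200, 512
X = np.random.default_rng(1).normal(size=(N, L))      # placeholder data
B = build_basis(X)
point = to_image(X[0], B, X.mean(axis=0))
```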
The article describes approaches to analyzing the information space using low-code platforms in order to identify the factors forming new identities in Azerbaijan and the unique features of the country's information landscape. It describes the steps taken to identify key themes, collect big data in the form of text corpora from various Internet sources, and analyze the data. The analysis covers text sentiment and the identification of opinion leaders, and includes monitoring of key topics, visualized for clear presentation of the results.
Keywords: data analytics, trend monitoring, sentiment analysis, data visualization, low-code, Kribrum, Polyanalyst, big data
The problem of optimising the selective assembly of plunger-housing precision joints in the feeders of centralised lubrication systems used in mechanical engineering, metallurgy, mining, etc. is considered. The probability of forming assembly sets of all types is used as the objective function; the controlled variables are the number and volumes of batches of parts and their adjustment centres, as well as the values of group tolerances. Several variants of solving the problem with different combinations of controlled variables are considered. An example of solving the optimisation problem on the basis of previously developed mathematical models with given initial data and constraints is provided, and the advantages and disadvantages of each variant are outlined. Optimisation increases the considered indicator by 5% to 20%.
Keywords: selective assembly, lubrication feeder, precision connection, mathematical model, optimisation
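The flavor of the objective function can be sketched as follows: with normal dimension distributions and fixed group bounds, the probability that a random plunger-housing pair falls into matching selective groups is a sum of products of group probabilities (all parameter values invented for illustration; the article's models are richer):

```python
import numpy as np
from scipy.stats import norm

def group_probs(mu, sigma, edges):
    """Probability of a part's dimension landing in each selective group
    bounded by `edges`, assuming a normal distribution."""
    cdf = norm.cdf(edges, loc=mu, scale=sigma)
    return np.diff(cdf)

edges = np.array([0.0, 4.0, 8.0, 12.0])      # group tolerance bounds, um
p_plunger = group_probs(mu=6.2, sigma=2.0, edges=edges)
p_housing = group_probs(mu=5.8, sigma=2.0, edges=edges)

# matched-group probability: both parts fall into the same group
p_assembly = float(np.sum(p_plunger * p_housing))
print(f"P(assembly set formed) = {p_assembly:.3f}")
```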
Automating government processes is a top priority in the digital era. Because of historical development, many systems for registering and storing data about individuals coexist, requiring intervening IT infrastructures. The article considers the development, creation, and implementation of software for updating and generating data about residents of the city of Astana, defines its functional capabilities, and determines the role of the information system in automating and monitoring government activities. The study was conducted by observing, synthesizing, analyzing, systematizing, and classifying the data received, drawing on the scientific works of local and foreign authors and on open databases. The authors have, for the first time, created the structure and algorithms of the information system known as the "Geonomics" Population Database; specifically, they have developed the mechanism and algorithm for the interaction of the "Geonomics" information system with government databases. In addition, opportunities for using the software have been identified by developing an algorithm for planning and placing social objects with the help of the "Geonomics" information system. The authors conclude that the algorithms developed for the "Geonomics" Population Database represent a reliable and powerful tool that plays a critical role in optimizing and automating processes related to population accounting and urban infrastructure management. The software contributes to the development of the city and the improvement of its residents' quality of life on the basis of up-to-date and reliable information. In addition, the developed algorithm allows real-time monitoring of current resident data and population density, on the basis of which decisions can be made on the construction and placement of social facilities for the comfortable service and living of city residents.
Keywords: automation, updating, government activities, government agency, information system, database
A complex dynamic system is defined by a structurally invariant operator. The operator's structure allows problems of stabilizing program motions or equilibrium positions of a complex dynamic system with constraints on state coordinates and control to be formulated. Solving these problems allows a structurally invariant operator of a complex dynamic system to be synthesized with inequality constraints on the vector of locally admissible controls and state coordinates. Computational experiments confirming the correctness of the synthesized structurally invariant projection operator are performed.
Keywords: structurally-invariant operator, stabilization of program motions, complex nonlinear dynamic system, projection operator, SimInTech
Digital holographic microscopy (DHM) is a combination of digital holography and microscopy. It can track transparent objects, such as the organelles of living cells, without the use of fluorescent markers. The main problem of DHM is increasing image spatial resolution while maintaining a wide field of view. The main approaches to solving this problem are increasing the numerical aperture of the illumination and recording systems and applying deep learning methods. The numerical aperture of illumination systems is increased by using oblique, structured, or speckle illumination; for recording systems it is increased by hologram extrapolation, synthesis, or super-resolution. Deep learning is usually used in conjunction with other methods to shorten computation time. This article describes the basic principles and features of these approaches.
Keywords: digital holographic microscopy, spatial resolution, field of view, numerical aperture, sample, light beam, CCD camera, diffraction, imaging system, super-resolution
Blurred frames pose a significant problem in fields such as video surveillance, medical imaging, and aerial photography when solving tasks such as object detection and identification, image-based disease diagnosis, and the analysis and processing of drone data for mapping and monitoring. This article proposes a method for detecting blurred frames using a neural network model that analyzes images represented in the frequency domain in Hough space. To evaluate the effectiveness of the proposed solution, it was compared with existing methods and algorithms applicable to the problem, namely the Laplacian method and the manual sampling method. The results show that the proposed method detects blurred frames with high accuracy and can be used in systems where high accuracy and clarity of visual data are required for decision-making.
Keywords: blurred frames, motion blur, blur, Hough transform, spectral analysis
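For context, the baseline Laplacian method mentioned above, together with the frequency-domain representation that the proposed model inspects, can be sketched with OpenCV and numpy (the threshold value is illustrative):

```python
import cv2
import numpy as np

def is_blurred(gray: np.ndarray, thresh: float = 100.0) -> bool:
    """Baseline used for comparison: the classic variance-of-Laplacian
    heuristic (low variance means few sharp edges, i.e. likely blur)."""
    return cv2.Laplacian(gray, cv2.CV_64F).var() < thresh

def log_spectrum(gray: np.ndarray) -> np.ndarray:
    """Log-magnitude spectrum: the frequency-domain representation in
    which motion blur leaves a directional streak that a subsequent
    Hough transform can pick out."""
    f = np.fft.fftshift(np.fft.fft2(gray.astype(np.float64)))
    return np.log1p(np.abs(f))
```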
In this paper, a new intent and entity recognition model for the air passenger service domain, labelled IRERAIR-TWIN, is developed using the 'no code' question-answer development platform 'TWIN'. The advantages of the no-code platform were analysed in terms of the ease of developing an application question-answer system and the reduced effort needed to develop an application model for a narrow subject area. The results show that the 'TWIN' system provides an intuitive web-based user interface and a simpler approach for developing the semantic module of a question-answer system capable of solving application problems of moderate complexity for a narrow subject area. However, this approach has limitations for deep semantic analysis tasks, especially complex contextual inference and the processing of large text fragments. The paper concludes by emphasising that future research will focus on ChatGPT-based 'low code' platforms and large language models to further improve the intelligence of the IRERAIR-TWIN model, aiming to broaden the scope of supported scenarios.
Keywords: question-answering systems, no-code, low-code, intent recognition, named entity recognition, data annotation, feature engineering, pre-trained model, software development, end-user development