
Russian Digital Libraries Journal

Published since 1998
ISSN 1562-5419

Search Results

Refutation of a Rumor by the Mass Media: Mathematical Model and Numerical Experiments

Alexander Petrovich Mikhailov, Alexander Petrov
371-386
Abstract:

We consider the process in which an unreliable rumor spreads through society and is opposed by mass-media broadcasting. The rumor's unreliability means that the media content contains a refutation and thereby inoculates individuals, making them immune to the rumor. At the same time, individuals who have already accepted the rumor stop trusting the media and thereby become unavailable for persuasion. A mathematical model of this process is proposed in two versions. The continuous-time version reveals some mathematical properties of the model. The discrete-time version is more convenient for analyzing real processes, since it allows the model's parameters to be estimated. To estimate these parameters, ratings data for the main socio-political programs of Russian TV channels were used. Several scenario calculations of the model with these parameters are presented. The main conclusion is that if the information disseminated by the media is not viral, that is, it is not retold by viewers to their neighbors in society, then the media are unable to resist rumors.
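
The conclusion hinges on the structural difference between the two channels: the rumor term is quadratic (adopters retell it), while the media term is only linear (the broadcast reaches a fixed share of the susceptible). A minimal discrete-time sketch of that structure, with purely illustrative compartments and rates rather than the authors' exact equations:

```python
# Minimal sketch of a discrete-time compartment model in the spirit of the
# abstract; the dynamics and parameter values are illustrative, not the
# authors' exact equations.

def simulate(steps=200, beta=0.4, alpha=0.05, r0=0.001):
    """beta: rumor transmission rate (viral, retold person to person);
       alpha: media reach per step (broadcast only, non-viral)."""
    s, r, i = 1.0 - r0, r0, 0.0     # susceptible, rumor adopters, inoculated
    for _ in range(steps):
        new_r = beta * s * r        # rumor spreads by contact
        new_i = alpha * s           # refutation inoculates part of s
        s, r, i = s - new_r - new_i, r + new_r, i + new_i
    return s, r, i

s, r, i = simulate()
print(f"susceptible {s:.3f}, rumor adopters {r:.3f}, inoculated {i:.3f}")
```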

Keywords: mathematical modeling, information warfare, numerical experiment, rumors.

Statistical Analysis of Observation Data of Air-Sea Interaction in the North Atlantic

Natalia Pavlovna Tuchkova, Konstantin Pavlovich Belyaev, Gury Mickailovich Mickailov
122-133
Abstract:

Observational data for 1979–2018 in the North Atlantic region are analyzed. The data were produced by a project of the Russian Academy of Sciences for the study of the atmosphere in the North Atlantic (RAS-NAAD). The dataset provides many surface and free-atmosphere parameters based on the sigma model and meets many requirements of meteorologists, climatologists and oceanographers working in both research and operational fields. The paper analyzes the seasonal and long-term variability of the heat-flux field and of the water surface temperature in the North Atlantic. Schemes for analyzing diffusion processes were used as the main research method. Based on the given 40-year series (1979–2018), such parameters of the diffusion processes as the mean (process drift) and the variance (process diffusion) were calculated, and their maps and time curves were constructed. The numerical calculations were performed on the Lomonosov-2 supercomputer of Lomonosov Moscow State University.
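
A hedged sketch of the drift/diffusion estimation step the abstract describes, applied to a synthetic monthly series standing in for the RAS-NAAD fields:

```python
import numpy as np

# Drift is estimated from the mean of the increments of the series,
# diffusion from their variance; the input here is synthetic data.

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0.02, 0.5, size=480))   # 40 years of monthly values

dt = 1.0                        # time step (one month, in arbitrary units)
dx = np.diff(x)
drift = dx.mean() / dt          # process drift
diffusion = dx.var() / dt       # process diffusion

print(f"drift = {drift:.4f}, diffusion = {diffusion:.4f} per step")
```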

Keywords: UDC 519.6, UDC 519.2.

Methods of “Humanizing” the MIDI Parts of Percussion Instruments

Azat Lenarovich Shaykhutdinov
273-282
Abstract: This study discusses some shortcomings of Gaussian humanization and shows how the articulation patterns developed by drummers can be emulated with a probabilistic model. An original algorithm for “humanizing” MIDI parts of percussion instruments was developed. Various dependencies and regularities that manifest themselves when playing drums are programmed. Note parameters were made to depend not only on the parameters of the preceding notes but also on the subsequent notes of the corresponding parts of the drum kit. The striking force was made to depend on the note's position in the measure, thus accenting notes on strong beats. The note's volume was also made to depend on its coincidence with notes of other parts of the drum kit. To increase the dynamics and liveliness of the part, the hi-hat volume is raised just before a snare-drum hit. The amplitudes of corresponding notes were compared to measure how far a part humanized by a given method differs from one played by a professional drummer. In listening tests, parts processed with the modified method sound noticeably more lively than Gaussian-humanized or quantized parts.
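
As an illustration of one of the listed dependencies (accenting notes on strong beats with a small random spread), here is a toy velocity-humanization pass; the tick grid and accent values are hypothetical and do not reproduce the author's algorithm:

```python
import random

# Accent notes that fall on strong beats of the bar and add a small
# Gaussian spread to the velocity.

def humanize(notes, ticks_per_beat=480, beats_per_bar=4):
    """notes: list of (tick, velocity); returns notes with adjusted velocity."""
    out = []
    for tick, vel in notes:
        beat = (tick // ticks_per_beat) % beats_per_bar
        accent = 12 if beat == 0 else (6 if beat == 2 else 0)   # strong beats
        jitter = random.gauss(0, 4)
        out.append((tick, max(1, min(127, round(vel + accent + jitter)))))
    return out

eighths = [(i * 240, 90) for i in range(8)]   # a bar of eighth-note hi-hats
print(humanize(eighths))
```
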
Keywords: notes, algorithm, MIDI part, percussion instruments.

Recommendation System for Selection of Players in Team Sports Built on the Basis of Machine Learning

Rinat Rustemovich Shigapov, Alexander Andreevich Ferenets
257-280
Abstract:

This article describes the development of a machine-learning-based recommender system for selecting players. The system is introduced using hockey as an example, with the possibility of extending it to other team sports. Different roles and characteristics of the players were considered for each sport: the article analyzes information about hockey, football, basketball and volleyball. The players' characteristics are structured and divided into general groups. For each parameter, coefficients are derived that show its impact on the result of a match. Various machine learning algorithms were used to build the model. A web interface for the application has been created.
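
The scoring core can be sketched under the assumption of a simple weighted sum: each characteristic carries a coefficient for its impact on the match result, and candidates are ranked by the total. Field names and weights are hypothetical:

```python
# Hypothetical per-characteristic coefficients and player records.
weights = {"goals": 0.4, "assists": 0.3, "plus_minus": 0.2, "hits": 0.1}

players = [
    {"name": "A", "goals": 30, "assists": 22, "plus_minus": 10, "hits": 40},
    {"name": "B", "goals": 18, "assists": 35, "plus_minus": 15, "hits": 25},
]

def score(p):
    return sum(w * p[k] for k, w in weights.items())

for p in sorted(players, key=score, reverse=True):   # recommendation order
    print(p["name"], round(score(p), 1))
```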

Keywords: sports, hockey, selection of players, recommender system, machine learning.

About the Criteria for Ranking Conferences

Alexander Sergeevich Kozitsyn
1001-1030
Abstract:

Ranking of scientific conferences plays a key role in the academic world, determining the significance and prestige of each event. For individual researchers, the main outcomes of ranking are: determining the quality and influence of a conference; guidance in selecting conferences; encouragement to conduct quality research; formation of the scientific community; and improving a conference's visibility and influence on the scientific community. The paper surveys currently existing conference catalogs and conference ranking systems, both fully automatic and those involving expert councils. It is noted that national ranking systems are created to promote and popularize domestic conferences and journals. Based on this review, the following criteria for ranking conferences can be formulated: indicators of publication activity, derived from the analysis of published conference proceedings; the credibility of the speakers and of the organizing committee; the number of presentations and its ratio to the number of participants; the time taken to review submissions; the ratio of submitted to accepted submissions; and retrospective and geographical parameters.
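
As a sketch only, the listed criteria could be folded into a composite score like the following; the paper formulates criteria rather than a formula, so the weights and normalizations here are hypothetical:

```python
# Hypothetical composite conference score over the criteria named above.

def conference_score(c, w=(0.35, 0.2, 0.15, 0.15, 0.15)):
    return (w[0] * c["citation_index"]            # publication-activity indicators
          + w[1] * c["speaker_authority"]         # credibility of speakers/committee
          + w[2] * c["talks_per_participant"]     # presentations vs. participants
          + w[3] * (1 - c["review_days"] / 180)   # faster reviewing scores higher
          + w[4] * (1 - c["acceptance_rate"]))    # selectivity

print(conference_score({"citation_index": 0.8, "speaker_authority": 0.7,
                        "talks_per_participant": 0.5, "review_days": 60,
                        "acceptance_rate": 0.3}))
```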

Keywords: scientometrics, conferences, ranking, information systems.

Application of Supercomputer Technologies for Long-Term Modeling of Permafrost Boundaries in the Oil and Gas Fields of the Arctic

Mikhail Yurievich Filimonov, Nataliia Anatolyevna Vaganova, Elena Nikolaevna Akimova, Vladimir Evgenevich Misilov
848-865
Abstract: A model of the propagation of thermal fields in permafrost from various engineering objects operating in Arctic regions is considered. The proposed model includes the most significant technical and climatic parameters affecting the formation of thermal fields in the surface layer of the soil. The main objective of the study is long-term forecasting of changes in the dynamics of permafrost boundaries during the operation of cluster sites of northern oil and gas fields. Such a forecast is obtained by simulating a complex system consisting of heat or cold sources and frozen soil, whose thawing can lead to the loss of bearing capacity and to possible technogenic and environmental accidents. For example, the heat sources can be production wells, and the cold sources can be seasonal cooling devices used to stabilize the soil. To minimize the impact of heat sources on permafrost, various options for thermal insulation are used, and to preserve the original temperature regime of the top layer of soil, riprap materials consisting of sand, concrete, foam concrete, or other heat-insulating material are used. The developed software package was used in the design of 12 northern oil and gas fields. Solving the described problem in a complex three-dimensional region requires substantial computational resources: computing one variant can often exceed 10–20 hours of machine time on a supercomputer. To speed up the numerical calculations, multi-core processors are used. The numerical calculations illustrate the capabilities of the developed software package for making long-term forecasts of changes in the boundaries of the permafrost zones, and show that on multi-core processors it is possible to achieve a speedup close to the theoretical one.
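
For orientation, a drastically reduced 1D explicit finite-difference sketch of heat propagating from a surface source into frozen soil; the paper's model is three-dimensional with phase transitions (a Stefan problem), so this only shows the time-stepping skeleton, and all parameters are illustrative:

```python
import numpy as np

# 1D explicit finite-difference heat conduction into initially frozen soil.

nz, dz, dt = 100, 0.1, 100.0          # grid points, m, s
kappa = 1e-6                          # thermal diffusivity, m^2/s
T = np.full(nz, -5.0)                 # initial permafrost temperature, C
T_source = 20.0                       # e.g. a production well at the surface

r = kappa * dt / dz**2                # explicit scheme needs r <= 0.5
assert r <= 0.5

for step in range(100_000):           # ~115 days of model time
    T[0] = T_source
    T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])

print("thaw front depth =", dz * int(np.argmax(T < 0.0)), "m")
```
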
Keywords: computer software, heat and mass transfer, cryolithozone, simulation, parallel computing, Stefan problem, OpenMP.

Development of a System for Collecting Data on the Movement of People Indoors

Chingiz Irekovich Fatikhov, Karen Albertovich Grigorian
87-102
Abstract:

The COVID-19 pandemic has made the problem of monitoring and analyzing the movement of people indoors more urgent: those who have been in contact with the sick must be identified promptly to prevent further spread of the infection.

The article proposes one way to solve this problem: the development of a system for determining and storing the location history of people inside premises. The article also discusses the methods, parameters and technologies that can be used to solve the indoor localization problem.
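
One standard ingredient of such indoor localization, sketched here as an assumption rather than the system's actual method, is converting received signal strength to distance with the log-distance path-loss model:

```python
# Log-distance path-loss model: distance from Wi-Fi/BLE signal strength.
# The constants (reference power, path-loss exponent) are illustrative.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """tx_power_dbm: RSSI measured at 1 m from the beacon;
       n: path-loss exponent (2 in free space, larger indoors)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

print(f"{rssi_to_distance(-75):.1f} m")   # = 6.3 m for these constants
```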

Keywords: location, localization, indoor positioning system, indoor location, IPS.

International Virtual Observatory: 10 years after

O.Yu. Malkov, O.B. Dluzhnevskaya, O.S. Bartunov, I.Yu. Zolotukhin
Abstract: The International Virtual Observatory (IVO) is a collection of integrated astronomical data archives and software tools that utilize computer networks to create an environment in which research can be conducted. Several countries have initiated national virtual observatory programs that will combine existing databases from ground-based and orbiting observatories and make them easily accessible to researchers. As a result, data from all the world's major observatories will be available to all users and to the public. This is significant not only because of the immense volume of astronomical data but also because the data on stars and galaxies have been compiled from observations at a variety of wavelengths: optical, radio, infrared, gamma-ray, X-ray and more. Each wavelength can provide different information about a celestial event or object, but also requires special expertise to interpret. In a virtual observatory environment, all of these data are integrated so that they can be synthesized and used in a given study. The International Virtual Observatory Alliance (IVOA) represents 17 international projects working in coordination to realize the essential technologies and interoperability standards necessary to create a new research infrastructure. The Russian Virtual Observatory is one of the founders and important members of the IVOA. The International Virtual Observatory project was launched about ten years ago, and this presentation discusses the major IVO achievements in science and technology in recent years. Standards for accessing large astronomical data sets were developed. Such data sets can accommodate the full range of wavelengths and observational techniques for all types of astronomical data: catalogues, images, spectra and time series. The described standards include standards for metadata, data formats, a query language, etc. Services for the federation of massive, distributed data sets, regardless of the wavelength, resolution and type of data, were developed. Effective mechanisms for publishing huge data sets and data products, as well as data analysis toolkits and services, are provided. The services include source extraction, parameter measurement and classification from databases; data mining in the image, spectra and catalogue domains; multivariate statistical tools; and multidimensional visualization techniques. The development of prototype VO services and capabilities implemented within existing data centers, surveys and observatories is also discussed. We show that the VO has evolved beyond the demonstration level to become a real research tool. Scientific results based on end-to-end use of VO tools are discussed in the presentation.
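
As a present-day illustration of the access standards mentioned (TAP and the ADQL query language), here is a hedged sketch using the pyvo package against SIMBAD's public TAP service; the URL, table and columns are examples, and network access is assumed:

```python
import pyvo

# Query a VO TAP service with ADQL; SIMBAD's table/columns used as examples.
tap = pyvo.dal.TAPService("http://simbad.u-strasbg.fr/simbad/sim-tap")
result = tap.search(
    "SELECT TOP 5 main_id, ra, dec FROM basic WHERE otype = 'G'"
)
for row in result:
    print(row["main_id"], row["ra"], row["dec"])
```
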
Keywords: virtual observatory, e-science, astronomical data.

Designing a Tool for Creating Gameplay through the Systematization of Game Mechanics

Aleksey Vitalevich Shubin, Vlada Vladimirovna Kugurakova
774-795
Abstract:

A new approach to the development of a tool aimed at simplifying the workflow of a game designer is presented. The requirements are elaborated, the work scenario is developed and the main parameters for the developed tool are specified. The main objective of the tool is to speed up and facilitate the selection of proper game mechanics without the need to spend valuable time on lengthy analysis of other videogame projects.


To make game designers' selection of game mechanics more effective, we analyzed a variety of approaches to the classification of game mechanics. Various classification methods were considered, and the analysis revealed which classifications are better suited to the decomposition of game mechanics. The results of the research allowed us to identify key aspects of game mechanics, which will serve as a foundation for the development of the tool.


This research represents an important step in creating a tool that will optimize the game design process and increase the speed of videogame development.

Keywords: game design, classification, game mechanics, automatization, videogame.

The Formation of a Culture of Students’ Thinking in Teaching Mathematics

Elena Evgenievna Alekseeva
308-324
Abstract: The article is devoted to the problem of forming a culture of students’ thinking. It is noted that forming a culture of students’ thinking is interconnected with the formation and development of metadisciplinary actions, and that the level of cognitive-skill formation characterizes the culture of students’ thinking. The solution is shown on the example of teaching the functional content line integrated with tasks with parameters.
Keywords: culture of thinking, cognitive skills, formation, activity, mathematics, functional content line, task, parameter, teaching, students, methods, integration, program, advanced qualification.

Principles of Multitrading

Felix Osvaldovich Kasparinsky
808-869
Abstract:

Modern software and hardware tools provide unprecedented freedom for a variety of activities in the forex markets, from trading to testing the feasibility of models of nonlinear processes in self-organizing systems. To reduce risks and increase the efficiency of interaction with market instruments, it is proposed to make trading adaptively variable by combining trading strategies across several trading accounts with different brokers, multiple financial instruments, and complex indicators of price-change tendencies. Three years of experimental work produced and tested the basic principles of multitrading, and an information environment was assembled that supports the development of an individualized trading system. The basic concept of organizing a multitrading information environment is the use of specialized hardware and software systems for strategic analysis and forecasting of price changes for an individual financial instrument, tactical selection of a promising financial instrument from the available set, and efficient operational activity with the orders of trading accounts. It can be expected that the evolution of the principles of multitrading will lead to analytical systems for predicting the kinetics of non-equilibrium changes in the characteristic parameters of self-organizing cooperative systems, with wide application in biology, cybernetics, economics, and the social sphere.

Keywords: multitrading, trading, forex, technical analysis, investments, work organization, efficiency, financial market, oscillations, forecast.

Automation of Reading Related Data from Relational and Non-Relational Databases in the Context of using the JPA Standard

Angelina Sergeevna Savincheva, Alexander Andreevich Ferenets
656-678
Abstract:

The article describes the automation of reading related data from relational and non-relational databases.


The developed software tool is based on the use of the JPA (Java Persistence API) standard, which defines the capabilities of managing the lifecycle of entities in Java applications. An architecture for embedding in event processes has been designed, allowing the solution to be integrated into projects regardless of which JPA implementation is used. Support for various data loading strategies, types, and relationship parameters has been implemented. The performance of the tool has been evaluated.
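
The tool itself targets Java's JPA, but the eager-versus-lazy relationship-loading choice it manages can be illustrated by analogy in Python's SQLAlchemy; this sketch is that analogy, not the developed software:

```python
from sqlalchemy import ForeignKey, create_engine, select
from sqlalchemy.orm import (DeclarativeBase, Mapped, Session,
                            mapped_column, relationship, selectinload)

class Base(DeclarativeBase):
    pass

class Author(Base):
    __tablename__ = "author"
    id: Mapped[int] = mapped_column(primary_key=True)
    # lazy="select" defers loading, analogous to JPA's FetchType.LAZY
    books: Mapped[list["Book"]] = relationship(lazy="select")

class Book(Base):
    __tablename__ = "book"
    id: Mapped[int] = mapped_column(primary_key=True)
    author_id: Mapped[int] = mapped_column(ForeignKey("author.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as s:
    # Override the strategy per query: fetch the relation eagerly in one
    # extra SELECT, avoiding N+1 reads when the relation is always needed.
    authors = s.scalars(select(Author).options(selectinload(Author.books))).all()
```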

Keywords: JPA, ORM, Java, databases, relational databases, non-relational databases.

Algorithm for linking translated articles using authorship statistics

Alexander Sergeevich Kozitsyn, Sergey Alexandrovich Afonin, Andrey Alexandrovich Zenzinov
494-505
Abstract: Over the last decades, scientometric techniques have been used to stimulate research activity. The number of published articles and their citation counts are among the most important scientometric parameters. In an automated environment, where publication metadata is gathered from various sources, correctly linking original papers with their translations into other languages is extremely important. In the paper we show that the known text-similarity measures are inefficient for the article-linkage problem. We propose a method for semi-automatic article linkage that uses only statistical data on authors' publication activity. This approach can link articles without training on the language of translation. The method was evaluated on a real-world collection of publication metadata from the ISTINA information system.
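
The heart of the approach can be sketched as scoring candidate pairs by the overlap of their author sets; the matching of transliterated name variants and the confirmation threshold are elided assumptions:

```python
# Score candidate translation pairs by author-set overlap, not text similarity.

def author_overlap(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)          # Jaccard similarity

orig = {"Kozitsyn", "Afonin", "Zenzinov"}
cand = {"Kozitsyn", "Afonin"}

if author_overlap(orig, cand) >= 0.5:       # hypothetical threshold
    print("link candidate for manual confirmation")   # semi-automatic step
```
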
Keywords: bibliographic data, graph analysis, translation, article, statistics, scientometrics, citation, automated systems.

Application of the Douglas-Peucker Algorithm in Online Authentication of Remote Work Tools for Specialist Training in Higher Education Group of Scientific Specialties (UGSN) 10.00.00

Anton Grigorievich Uymin, Vladimir Sergeyevich Grekov
679-694
Abstract:

In today's world, digital technologies are penetrating all aspects of human activity, including education and labor. Since 2019, when educational systems worldwide began actively shifting to distance learning in response to global challenges, there has been an urgent need to develop and implement reliable identification and authentication technologies. These technologies are necessary to ensure the authenticity of work and to protect academic achievements from falsification, especially in higher education within the group of specialties and directions (UGSN) 10.00.00 - Information Security, where laboratory and practical work play a key role in the educational process.


The problem lies in the need to optimize the flow of incoming data, which, first, can affect the retraining of the neural network core of the recognition system, and second, impose excessive requirements on the network's bandwidth. To solve this problem, efficient preprocessing of gesture data is required to simplify their trajectories while preserving the key features of the gestures.


This article proposes the use of the Douglas–Peucker algorithm for preliminary processing of mouse gesture trajectory data. This algorithm significantly reduces the number of points in the trajectories, simplifying them while preserving the main shape of the gestures. The data with simplified trajectories are then used to train neural networks.
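
For reference, a standard recursive Douglas–Peucker implementation; the tolerance value that yields the paper's reported ~60% point reduction is specific to their gesture data:

```python
import math

# Classic Douglas-Peucker simplification: drop points whose distance to the
# chord between the endpoints is below eps, recursing on the farthest point.

def perp_dist(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

def douglas_peucker(points, eps):
    if len(points) < 3:
        return points
    i, dmax = max(((i, perp_dist(p, points[0], points[-1]))
                   for i, p in enumerate(points[1:-1], 1)), key=lambda t: t[1])
    if dmax <= eps:
        return [points[0], points[-1]]
    return douglas_peucker(points[:i + 1], eps)[:-1] + douglas_peucker(points[i:], eps)

track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(track, eps=0.5))
```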


The experimental part of the work showed that the application of the Douglas–Peucker algorithm allows for a 60% reduction in the number of points in the trajectories, leading to an increase in gesture recognition accuracy from 70% to 82%. Such data simplification contributes to speeding up the neural networks' training process and improving their operational efficiency.


The study confirmed the effectiveness of using the Douglas–Peucker algorithm for preliminary data processing in mouse gesture recognition tasks. The article suggests directions for further research, including the optimization of the algorithm's parameters for different types of gestures and exploring the possibility of combining it with other machine learning methods. The obtained results can be applied to developing more intuitive and adaptive user interfaces.

Keywords: authentication, biometric identification, remote work, distance learning, Douglas–Peucker algorithm, data preprocessing, neural network, HID devices, mouse gesture trajectories, data optimization.

Optimization of C++ Numerical Simulation Algorithms Using Multithreading Methods

Yuri Sergeevich Efimov
640-653
Abstract:

The main methods of numerical simulation (the finite difference method, the finite element method, the Monte Carlo method, the Runge–Kutta method) are presented. The main parameters used to optimize numerical modeling algorithms with respect to code execution time and efficient use of processor resources are considered. The main drawbacks of multithreading, related to data synchronization, deadlocks and race conditions, are analyzed, along with methods for eliminating them based on mutexes and atomic operations, using the Monte Carlo method as an example.
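
Although the article's setting is C++, the race-condition-and-mutex point translates directly; in this Python sketch each thread accumulates a local Monte Carlo count and merges it into the shared total only under a lock, instead of incrementing shared state unsynchronized:

```python
import random
import threading

total_inside = 0
lock = threading.Lock()

def worker(n):
    global total_inside
    rng = random.Random()
    # Count samples inside the quarter circle, locally (no shared state).
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    with lock:                      # the "mutex" protecting the shared total
        total_inside += inside

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print("pi =", 4 * total_inside / 400_000)
```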

Keywords: programming language C++, multithreading methods, numerical simulation, data synchronization.

Seasonal and Decadal Variability of Atmosphere Pressure in Arctic, its Statistical and Temporal Analysis

Konstantin Pavlovich Belyaev, Gury Mickailovich Mickailov, Alexey Nikolaevich Salnikov, Natalia Pavlovna Tuchkova
57-73
Abstract:

The paper analyzes the statistical and temporal seasonal and decadal variability of the atmospheric pressure field in the Arctic region of Russia. Schemes for the frequency analysis of probability transitions for characteristics of stochastic diffusion processes were used as the main research method. Based on the given 60-year series (1948–2008), such parameters of the diffusion processes as the mean (process drift) and the variance (process diffusion) were calculated, and their maps and time curves were constructed. The seasonal and long-term variability of the calculated fields was studied, as well as its dependence on the discretization of the frequency intervals. These characteristics were analyzed and their geophysical interpretation was carried out; in particular, the known 11- and 22-year cycles of solar activity were revealed. Numerical calculations were performed on the Lomonosov-2 supercomputer of Lomonosov Moscow State University.

Keywords: time series analysis, random diffusion processes, seasonal and long-term variability of atmospheric pressure.

Automated System for Selecting Optimal Methods for Solving Acoustic Problems Based on Ontology

Irina Leonidovna Artemieva, Alina Evgenevna Chusova
719-737
Abstract:

The report presents a software package that allows specialists in architectural acoustics to choose the most appropriate methods for modeling sound and selecting finishing materials depending on the tasks and parameters of a building. A distinctive feature of this system is an ontology of the subject area that describes the terms and the relationships between concepts, as well as modules for solving various problems in architectural acoustics. This approach makes it possible to recommend the most suitable simulation methods for a user's request by taking into account the specifics of the premises and the functional requirements of the client. The on-demand software system can optimize and parallelize programs written in a domain-specific programming language. The paper describes the principles of source-code analysis used to identify critical areas and modify them using a bank of patterns. The report also discusses an approach to developing a domain-specific programming language based on the domain ontology, ODSL (Ontology-Based Domain-Specific Language), which allows specialists to describe algorithms without accounting for specific optimization and parallelization methods. The novelty of the work lies in the proposed architecture of modules based on an applied ontology, which makes it possible to adapt the solution to other subject areas.

Keywords: ontology, architectural acoustics, optimization, parallelism, ODSL.

Features of Developing the Electronic Resource "Materials for a Syntactic Dictionary of the XIX Century"

A.A. Kotov, G.B. Gurin, A.V. Sedov, M.Yu. Nekrasov, Yu.V. Sidorov, A.A. Rogov
Abstract: This article describes an annotated corpus of XIX-century publicistic texts in the original orthography (http://smalt.karelia.ru/corpus/index.phtml); it also substantiates the choice of theory and of the markup parameters and discusses some of the annotation's complexities. The corpus is based on texts by Vladimir Dal, texts by Fyodor Dostoyevsky, and articles by publicists who collaborated with him.
Keywords: corpus, marking, annotation, electronic dictionary.

MetaHuman Synthetic Dataset for Optimizing 3D Model Skinning

Rim Radikovich Gazizov, Makar Dmitrievich Belov
244-279
Abstract:

In this study, we present a method for creating a synthetic dataset using the MetaHuman framework to optimize the skinning of 3D models. The research focuses on improving the quality of skeletal deformation (skinning) by leveraging a diverse array of high-fidelity virtual human models. Using MetaHuman, we generated an extensive dataset comprising dozens of virtual characters with varied anthropometric features and precisely defined skinning weight parameters. This data was used to train an algorithm that optimizes the distribution of skinning weights between bones and the character mesh.
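
One small, concrete piece of such weight optimization, sketched with hypothetical shapes: whatever a network predicts, per-vertex weights must be clipped, limited to a few influencing bones, and renormalized to sum to one before driving deformation:

```python
import numpy as np

# Post-process predicted skinning weights (n vertices x k bones).

def normalize_weights(w, max_influences=4):
    w = np.clip(w, 0.0, None)                       # no negative influence
    idx = np.argsort(w, axis=1)[:, :-max_influences]
    np.put_along_axis(w, idx, 0.0, axis=1)          # keep the strongest bones
    return w / w.sum(axis=1, keepdims=True)         # weights sum to 1

raw = np.random.default_rng(1).random((5, 8))       # e.g. raw network output
print(normalize_weights(raw).sum(axis=1))           # -> all 1.0
```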


The proposed approach automates the weight rigging process, significantly reducing manual effort for riggers and increasing the accuracy of deformations during animation. Experimental results show that leveraging synthetic data reduces skinning errors and produces smoother character movements compared to traditional methods. The outcomes have direct applications in the video game, animation, virtual reality, and simulation industries, where rapid and high-quality rigging of numerous characters is required. The method can be integrated into existing graphics engines and development pipelines (such as Unreal Engine or Unity) as a plugin or tool, facilitating the adoption of this technology in practical projects.

Keywords: synthetic dataset, Metahuman, neural networks, 3D model skinning, computer animation, machine learning.

Image Classification Using Reinforcement Learning

Artem Aleksandrovich Elizarov, Evgenii Viktorovich Razinkov
1172-1191
Abstract:

Reinforcement learning is a rapidly developing direction of machine learning. As a consequence, attempts are being made to apply reinforcement learning to computer vision problems, in particular to image classification, which is currently among the most urgent tasks of artificial intelligence.


The article proposes a method for image classification using a deep neural network with reinforcement learning. The idea of the developed method comes down to solving a contextual multi-armed bandit problem using various strategies for trading off exploitation against exploration, together with reinforcement learning algorithms. Strategies such as ε-greedy, softmax, decay-softmax and the UCB1 method are considered, along with reinforcement learning algorithms such as DQN, REINFORCE and A2C. The influence of various parameters on the efficiency of the method is analyzed, and options for further development of the method are proposed.
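
A minimal sketch of the ε-greedy strategy in this setting, where arms correspond to class labels; the value estimates here are a toy lookup table, standing in for the deep network trained with DQN/REINFORCE/A2C:

```python
import random

def eps_greedy(q_values, eps=0.1):
    if random.random() < eps:                      # explore
        return random.randrange(len(q_values))
    return max(range(len(q_values)), key=q_values.__getitem__)  # exploit

q = [0.1, 0.7, 0.2]        # estimated value of each label for this image
picks = [eps_greedy(q) for _ in range(1000)]
print("greedy share:", picks.count(1) / 1000)      # about 1 - eps + eps/3
```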

Keywords: machine learning, image classification, reinforcement learning, contextual multi-armed bandit problem.

Neural Network for Generating Images Based on Song Lyrics using OpenAI and CLIP Models

Alsu Rishatovna Davletgareeva, Ksenia Aleksandrovna Edkova
437-455
Abstract:

The effectiveness of the ImageNet diffusion model and of CLIP models for generating images from textual descriptions was investigated. Two experiments were conducted with various textual inputs and different parameters to determine the optimal settings for generating images from text. The results showed that while ImageNet performed well in generating images, CLIP demonstrated better alignment between textual prompts and the relevant images. The results highlight the high potential of combining these models for creating high-quality, contextually relevant images from textual descriptions.
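
A hedged sketch of the text-image alignment scoring that CLIP provides, via the Hugging Face transformers API (assuming the package and model weights are available; the image file and prompts are placeholders):

```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("generated.png")                # placeholder image file
texts = ["a lonely road at night", "a sunny beach"]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
logits = model(**inputs).logits_per_image          # image-text similarity
print(logits.softmax(dim=1))                       # pick the best-aligned text
```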

Keywords: image generation, artificial intelligence, ImageNet diffusion model, CLIP, deep learning, neural networks, natural language processing.

Generative Methods for Creating Adaptive Playable Characters in Service Games

Timur Ruzelevich Arslanov
468-483
Abstract:

With the growing popularity of game services that require constant content updates to retain players, automating the generation of adaptive playable characters has become an urgent task. This article examines existing approaches to character generation, including evolutionary algorithms and in-session adaptation systems. Current solutions are limited by their inability to provide sufficient long-term adaptation to individual player styles and by their reliance on manual design.


To address these limitations, we propose a three-component system that integrates: player action modeling based on gameplay replays using reinforcement learning (RL) agents, character generation through combinatorial mechanics and parameter balancing, automatic validation via simulations to assess balance and alignment with a player’s individual style.
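
A skeleton of that three-component loop, with every body reduced to a stub, since the abstract specifies the architecture rather than the algorithms:

```python
def model_player(replays):
    """1. RL agent fitted to gameplay replays; returns a toy style profile."""
    return {"aggression": 0.8, "range_pref": 0.3}

def generate_character(style):
    """2. Combinatorial mechanics plus parameter balancing."""
    return {"mechanics": ["dash", "melee_combo"],
            "power_budget": 100 * (0.5 + style["aggression"] / 2)}

def validate(character, style):
    """3. Simulation-based check of balance and style alignment."""
    return {"balanced": character["power_budget"] <= 110,
            "style_fit": style["aggression"]}

style = model_player(["replay_001.bin"])     # placeholder replay data
candidate = generate_character(style)
print(validate(candidate, style))
```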


This work synthesizes contemporary research, highlighting the potential of generative methods to reduce development costs for game services. The results could accelerate prototyping and enhance the long-term viability of live-service projects.

Keywords: service games, game design, game characters, video game, procedural content generation.

New Method of Description of Eddy-Covariance Ecologic Data

Raoul Rashidovich Nigmatullin, Alexander Alekseevich Litvinov, Sergey Igorevich Osokin
41-75
Abstract:

In this paper, the authors propose the foundations of an original theory of quasi-reproducible experiments (QRE) based on the testable hypothesis that an essential correlation (memory) exists between successive measurements. Based on this hypothesis, which the authors define for brevity as the verified partial correlation principle (VPCP), it can be proved that a universal fitting function (UFF) exists for quasi-reproducible (QR) measurements. In other words, there is a common platform or "bridge" on which, figuratively speaking, a true theory (claiming to describe data from first principles or verifiable models) meets an experiment that offers measured data for the theory's verification, maximally "cleaned" from the influence of uncontrollable factors and of the apparatus/software function. The proposed theory gives a researcher a method for purifying the initial data and finally suggests a curve that is periodic and cleaned of uncontrollable factors. This final curve corresponds to an ideal experiment.


The proposed theory has been tested on eddy-covariance ecological data on the content of CH4, CO2 and H2O vapor in the local atmosphere where the corresponding gas detectors are located.


For these eddy-covariance data, associated with the presence of the gases CH4 and CO2 and of H2O vapor in the atmosphere, there is no simple hypothesis containing a minimal number of fitting parameters; therefore, the fitting function that follows from this theory can serve as the only reliable quantitative description of this kind of data for the complex system under study. We also note that the final fitting function, freed from uncontrollable factors, becomes purely periodic and corresponds to an ideal experiment. The paper discusses practical applications of this theory, its place among alternative approaches (especially those touching the professional interests of ecologists), and its further development.
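
The hypothesis of memory between successive measurements can be probed numerically in a few lines; this sketch merely correlates successive synthetic measurement cycles and is far simpler than the paper's VPCP/UFF machinery:

```python
import numpy as np

# Correlate each measurement cycle with the next one ("memory" check).
rng = np.random.default_rng(2)
base = np.sin(np.linspace(0, 4 * np.pi, 200))
cycles = [base + 0.1 * rng.normal(size=200) for _ in range(10)]  # synthetic

r = [np.corrcoef(cycles[m], cycles[m + 1])[0, 1] for m in range(9)]
print("mean correlation between successive cycles:", round(float(np.mean(r)), 3))
```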



Keywords: quasi-reproducible experiments, complex systems, verified partial correlation principle, universal fitting function, quasi-periodic measurements, quasi-reproducible measurements, memory effects, eddy covariance.

Stability Studies of a Coupled Model to Perturbation of Initial Data

Konstantin Pavlovich Belyaev, Gury Mikhaylovich Mikhaylov, Alexey Nikolaevich Salnikov, Natalia Pavlovna Tuchkova
615-633
Abstract: The stability problem is considered in terms of the classical Lyapunov definition. To this end, a set of initial conditions obtained from preliminary calculations is specified, and the spread of the trajectories produced by numerical simulation is analyzed. This procedure is implemented as a series of ensemble experiments with the coupled MPI-ESM model of the Max Planck Institute for Meteorology (Germany). For the numerical modeling, a series of different initial values of the characteristic fields was specified, and the model was integrated from each of these fields over different time periods. Extreme ocean-level characteristics over a period of 30 years were studied. The statistical distribution was constructed, its parameters were estimated, and a statistical forecast 5 years ahead was studied. It is shown that the statistical forecast of the level agrees with the calculated forecast obtained from the model. The localization of extreme level values was studied and the results were analyzed. Numerical calculations were performed on the Lomonosov-2 supercomputer of Lomonosov Moscow State University.
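
The ensemble procedure can be sketched on a toy system: integrate the model from slightly perturbed initial states and track the dispersion of the trajectories; here a logistic map stands in for MPI-ESM:

```python
import numpy as np

def model_step(x, r=3.9):
    return r * x * (1 - x)          # toy chaotic "model"

rng = np.random.default_rng(3)
ensemble = 0.5 + 1e-6 * rng.normal(size=16)   # perturbed initial conditions

spread = []
for t in range(40):
    ensemble = model_step(ensemble)
    spread.append(ensemble.std())             # trajectory dispersion at t

print("spread at t=0..4:", np.round(spread[:5], 8))
```
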
Keywords: non-linear circulation models, ensemble numerical experiments, analysis of the stability of model trajectories.