Published: 12.12.2024
Full Issue
Articles
On Presentation of the Results of a Scientific Institute in the Form of a Knowledge Graph in a Semantic Library
The problem of presenting the scientific results of an academic institute in a digital environment is considered. A new look at the knowledge space of a scientific institute constitutes a natural stage in the development of Web technologies. The data structure inherent in previous studies makes it possible to organize search and navigation through them using a knowledge graph, implemented as a version of the semantic library LibMeta. The knowledge graph gives a more complete, higher-quality picture of the knowledge space, often reducing the cognitive load of perceiving complex structures and data connections.
Alive Publications are Gaining Popularity
An alive publication is a new genre for presenting the results of scientific research, in which a scientific work is published online and then continuously developed and improved by its author. Serious errors and typos are no longer fatal, nor do they haunt the author for the rest of his or her life. The reader of an alive publication knows that the author is constantly monitoring changes occurring in this branch of science. Meanwhile, at present a Russian author who maintains an alive publication loses out dramatically on many generally accepted bibliometric indicators. The alive publication also encourages development of the bibliographic apparatus: each bibliographic reference will soon have to carry an attribute as important to the reader as the date of the last revision of the alive publication, updated on the fly. It is to be expected that, as the alive publication spreads through the scientific world, the author's concern for a publication's evolution will become like a parent's care for the development of a child. The Internet will be filled with scientific publications that do not lose their relevance over time.
Selecting Solutions in an Educational Programming Language Simulator
The article is devoted to design decisions in a simulator project for teaching programming, intended for initial familiarization with the basic concepts of process interaction and computation control. However complex the world of parallelism may be, a programmer training system will have to master it and offer a methodology for thoroughly introducing its non-obvious phenomena. The simulator is based on experience with controlling the interaction of toy robots moving on a checkered board. The material is of interest to programmers, students and graduate students specializing in systems and theoretical programming.
Development of Lightweight Parsers with Different Go Language Granularity
We consider an approach to creating a family of lightweight grammars with the Any symbol, which denotes skipping parts of the code [1]. A definition and examples of increasing the granularity of grammar rules are given. The memory and time efficiency of lightweight parsers is analyzed on seven industrial repositories. It is shown that increasing grammar granularity does not significantly increase parser resource consumption, which varies only slightly with repository type and Go coding style. Furthermore, the advantages of using lightweight grammars with Any over full grammars are summarized. An example of using a lightweight grammar to measure code complexity is presented. In addition, the results can be applied to estimating the parser's share of total resource consumption, for example in the task of code binding and project markup.
Analytical Statistics about Scientific Publications of the Kazan Federal University on Scilit
The paper examines issues related to the presentation of information about publications of KFU researchers, teachers, graduate students and students, as well as about the University’s scientific sources, in the information and analytical materials of the Scilit system. Specific examples show the advantage of complete and correct metadata for scientific publications, as well as the problems that arise when bibliographic information is handled carelessly.
Multitrading System Forecasts
The article is devoted to the problem of forecasting price trends of financial instruments on the Forex market. Methods of forming forecasts based on business cycle models and fractal self-organization of pricing are considered. Based on historical precedents of crises after 1812 and 1917, the time frame of the 2015-2027 crisis is determined, whose end coincides with the simultaneous end of the 200-year and 40-year trends. It is predicted that the point of technological singularity will be reached in 2039. Methods are developed for integrating fundamental and technical analysis tools to forecast global events that are absent from the economic calendar. It is proposed to increase the efficiency of forecasting price changes of financial instruments using an analytical multitrading system designed to work with six strategies: long-term (8 months), medium-term (2 months), short-term (1.5 weeks and 1.5 days) and intraday (8 hours, 2 hours). The choice of strategy depends on the time a trader is ready to devote to analytical activity and control of open positions, on acceptable risks and on expected profitability. For all strategies, a set of preferred currency pairs of the Forex Club and FxPro brokers is established, and recommendations for traders are given. A necessary and sufficient set of technical analysis indicators is determined that participates in forming a triple signal for positioning the starting point of the Regression Channel, which allows automatic forecasting of tactical levels of trend reversal over the interval in which groups of eight price oscillations form. Regulations have been developed for the creation, publication and verification of tactical forecasts of the duration and amplitude of price oscillations for all strategies and many financial instruments. Forecasts are published in the "Multitrading" channels and groups of the Telegram, Dzen and VKontakte network services. These forecasting tools are intended for use in shaping multitrading system tactics.
About the Criteria for Ranking Conferences
Ranking of scientific conferences plays a key role in the academic world, determining the significance and prestige of each event. For individual researchers, the main results of ranking are: determining the quality and influence of a scientific conference; a guide for selecting conferences; an incentive to conduct quality research; formation of the scientific community; and improving a conference's visibility and influence on the scientific community. The paper provides an overview of currently existing conference catalogs and conference ranking systems, both automatic and those involving expert councils. It is noted that the purpose of creating national ranking systems is to promote and popularize domestic conferences and journals. Based on this review, the following criteria for ranking conferences can be formulated: indicators of publication activity, based on analysis of the published conference materials; the credibility of the speakers and the organizing committee; the number of presentations and its ratio to the number of conference participants; the time taken to review submissions; the ratio of submitted to accepted submissions; and retrospective and geographical parameters.
On the Interaction of the Common Digital Space of Scientific Knowledge with the National Electronic Library
This article focuses on the interaction between the Common Digital Space of Scientific Knowledge (CDSSK) and the National Electronic Library (NEB). The primary architectural features and objectives of the CDSSK are introduced. The NEB's structure and the technology for filling its collections are examined, and the current composition of the library collections is analyzed. The establishment and activities of the NEB are considered within the legal framework, and areas of interaction between the CDSSK and the NEB are proposed.
Review of Technologies for Ensuring Security and Protection of Email Systems in a Scientific Organization
The paper provides an overview of modern technologies used in processing email messages to solve the problem of receiving trusted email. Recommended settings for successful operation are provided.
An Approach to Creating an HTML Version of a Scientific Article from a Manuscript in MS Word Format for a Low-Budget Publisher
The most common approach to creating an HTML version of a journal article among scientific publishers is to first create an XML version of the article in accordance with the NISO Journal Article Tag Suite (JATS) standard, followed by automatic conversion to HTML and PDF formats. However, obtaining an XML version from a manuscript in the .docx format of the MS Word word processor, often used by authors, is a difficult task when the manuscript contains a large number of complex formulas and tables. Existing software either does not cope with it in full or is expensive and inaccessible to small publishers with a limited budget. This paper proposes an approach to creating an HTML version of a journal article from a manuscript in .docx format containing formulas in MathType format, which does not require significant financial or time costs from the publisher. It also describes a currently implemented prototype of a converter based on this approach, which translates scientific articles from .docx format to HTML and JATS XML and is applicable to KIAM preprints.
Analysis of Intra-Annual Variability of Heat Fluxes in the North Atlantic Based on Approximation of Trajectories of the Stochastic Diffusion Process
To analyze heat fluxes, observational data for 1979-2018 for the North Atlantic were used. The spatiotemporal variability of the total heat flux was modeled by a stochastic diffusion process. The coefficients of the stochastic differential equation were estimated using nonparametric statistics. Previously, the existence and uniqueness of a solution, in the strong sense, of the stochastic differential equation generated by the constructed diffusion process was proven under Kolmogorov's conditions. In this work, the coefficients of the equation were approximated in time by trigonometric polynomials whose amplitudes and phases depend on the flux values. Using the 40-year series from 1979 to 2018, spatial maps and time curves were constructed. Results are shown for 1999 and 2018, together with their comparative analysis. Numerical calculations were performed on the Lomonosov-2 supercomputer of Lomonosov Moscow State University.
An Ontology-Based Approach for Distributed Multi-Agent Modeling of the Radio-Technical Systems
The ontology-based approach to multi-agent modeling involves implementing a modeling system through the creation of ontologies. An example of a holistic implementation of the ontology-based approach to agent-based modeling is the IEEE 1516 Standard for Modeling and Simulation High Level Architecture. The work is devoted to a multi-agent modeling system designed for modeling complex radio engineering systems (especially radar systems); its relevance stems from the need to replace some field tests of radio engineering systems with simulation experiments. The motivation for switching to the IEEE 1516 standard for a "heavy" multi-agent modeling system is, among other things, to ensure scalability, openness and reuse of the developed agent models, which is most naturally achieved on the basis of an existing, well-developed and proven standard that establishes rules for the interaction of models and the development of software interfaces. The general principles of construction and the architecture of the modeling system are given. The basic requirements for the main modeling agents and their role and place in the complex modeling system are shown; a special place among them is occupied by the simulator of the background-target environment. The possibility of combining two simulation schemes, discrete-event and step-by-step, is also discussed. The step-by-step scheme has advantages such as simplicity and clarity, and it is convenient for modeling processing algorithms and components of radio engineering systems; however, true autonomy and asynchrony of agents cannot be implemented in it. Combining the two schemes makes it possible to combine their advantages.