
Russian Digital Libraries Journal

Published since 1998
ISSN 1562-5419

Search Results

The Project Approach in Training of Information Security Students

Igor Vasilishin, Evgeny Khaimin, Lyudmila Khaimina
210-215
Abstract: This article discusses the project approach to organizing educational activities. A brief description of educational and research projects at NArFU is given.
Keywords: projects, educational activities, digital environment, network projects, digital economy.

Application of Synthetic Data to the Problem of Anomaly Detection in the Field of Information Security

Artem Igorevich Gurianov
187-200
Abstract:

Currently, synthetic data is highly relevant in machine learning. Modern synthetic data generation algorithms make it possible to generate data that is very similar in statistical properties to the original data. Synthetic data is used in practice in a wide range of tasks, including those related to data augmentation.


The author of the article proposes a data augmentation method that combines the approaches of increasing the sample size using synthetic data and synthetic anomaly generation. This method has been used to solve an information security problem of anomaly detection in server logs in order to detect attacks.


The model trained for the task shows strong results. This demonstrates the effectiveness of using synthetic data both to increase sample size and to generate anomalies, and shows that the two approaches can be combined with high efficiency.
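The two augmentation steps the abstract combines can be sketched in a few lines. Everything below (the Gaussian generator, the 5-sigma anomaly placement, the toy sample) is an illustrative assumption, not the authors' actual method:

```python
import random
import statistics

def synth_normals(samples, n, rng=random):
    """Draw n synthetic points from a Gaussian fitted to the original
    sample -- a minimal stand-in for the heavier generators modern
    synthetic-data tools use."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return [rng.gauss(mu, sigma) for _ in range(n)]

def synth_anomalies(samples, n, spread=5.0, rng=random):
    """Place synthetic anomalies far outside the observed range
    (here: `spread` standard deviations from the sample mean)."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return [mu + rng.choice((-1.0, 1.0)) * spread * sigma for _ in range(n)]

# Augmented training set: originals and synthetic normals labelled 0,
# synthetic anomalies labelled 1 -- ready for any binary classifier.
normal = [10.2, 9.8, 10.0, 10.4, 9.6]
X = normal + synth_normals(normal, 20) + synth_anomalies(normal, 5)
y = [0] * (len(normal) + 20) + [1] * 5
```

A detector trained on such a set sees labelled anomalies even when the original logs contain none, which is the point of combining the two approaches.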

Keywords: synthetic data, anomaly detection, information security, anomaly generation, data augmentation, machine learning.

Using Web-Quest Technology in Cybersecurity Training

Olga Troitskaya, Eva Vohtomina
195-201
Abstract: The article justifies the need for schoolchildren to develop safe-behavior skills in cyberspace. One way to do so is web-quest technology. The article briefly describes this technology and gives an example of its use in teaching the basics of cybersecurity.
Keywords: web-quest, cybersecurity, cyberthreat, safe behavior.

Review of Technologies for Ensuring Security and Protection of Email Systems in a Scientific Organization

Gury Mikhailovich Mikhailov, Andrey Mikhailovich Chernetsov
1055-1063
Abstract:

The paper provides an overview of modern technologies used in processing email messages to ensure that only trusted mail is received. Recommended settings for successful operation are provided.
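As a rough illustration of the record types the paper reviews, the sketch below shows hypothetical SPF and DMARC DNS TXT records and a minimal tag parser; the concrete values are invented for the example and are not the paper's recommended settings:

```python
# Hypothetical TXT records for a domain; real values depend on the
# organization's mail infrastructure.
SPF_RECORD = "v=spf1 mx ip4:192.0.2.0/24 -all"
DMARC_RECORD = "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.org"

def parse_dmarc(record):
    """Split a DMARC TXT record into its tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            tags[key.strip()] = value.strip()
    return tags
```

Here `-all` in the SPF record rejects mail from unlisted hosts, and `p=quarantine` tells receivers to quarantine messages failing DMARC alignment.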

Keywords: e-mail, SPF, DMARC, DKIM.

Application of the Douglas-Peucker Algorithm in Online Authentication of Remote Work Tools for Specialist Training in Higher Education Group of Scientific Specialties (UGSN) 10.00.00

Anton Grigorievich Uymin, Vladimir Sergeyevich Grekov
679-694
Abstract:

In today's world, digital technologies are penetrating all aspects of human activity, including education and labor. Since 2019, when educational systems worldwide began actively shifting to distance learning in response to global challenges, there has been an urgent need to develop and implement reliable identification and authentication technologies. These technologies are needed to ensure the authenticity of work and to protect against falsification of academic achievements, especially in higher education under the group of scientific specialties (UGSN) 10.00.00 - Information Security, where laboratory and practical work play a key role in the educational process.


The problem lies in the need to optimize the incoming data stream, which can, first, cause overfitting of the neural network core of the recognition system and, second, impose excessive requirements on network bandwidth. Solving this problem requires efficient preprocessing of gesture data that simplifies their trajectories while preserving the key features of the gestures.


This article proposes the use of the Douglas–Peucker algorithm for preliminary processing of mouse gesture trajectory data. This algorithm significantly reduces the number of points in the trajectories, simplifying them while preserving the main shape of the gestures. The data with simplified trajectories are then used to train neural networks.
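The Douglas–Peucker algorithm itself is standard; a minimal self-contained Python version (independent of the authors' implementation) looks like this:

```python
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the line through segment endpoints a, b."""
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Recursively drop points closer than epsilon to the chord
    between the endpoints, keeping the overall trajectory shape."""
    if len(points) < 3:
        return points[:]
    # Find the point farthest from the chord between the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    # Keep the farthest point and simplify both halves recursively.
    left = douglas_peucker(points[: index + 1], epsilon)
    right = douglas_peucker(points[index:], epsilon)
    return left[:-1] + right
```

For mouse trajectories, `epsilon` trades compression against fidelity: larger values discard more points, which is how the paper obtains its roughly 60% point reduction.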


The experimental part of the work showed that the application of the Douglas–Peucker algorithm allows for a 60% reduction in the number of points in the trajectories, leading to an increase in gesture recognition accuracy from 70% to 82%. Such data simplification contributes to speeding up the neural networks' training process and improving their operational efficiency.


The study confirmed the effectiveness of using the Douglas–Peucker algorithm for preliminary data processing in mouse gesture recognition tasks. The article suggests directions for further research, including the optimization of the algorithm's parameters for different types of gestures and exploring the possibility of combining it with other machine learning methods. The obtained results can be applied to developing more intuitive and adaptive user interfaces.

Keywords: authentication, biometric identification, remote work, distance learning, Douglas–Peucker algorithm, data preprocessing, neural network, HID devices, mouse gesture trajectories, data optimization.

Experience of Implementing TLS 1.3 Protocol Verification

Aleksey Vyacheslavovich Nikeshin, Victor Zinovievich Shnitman
902-922
Abstract:

This paper presents the experience of verifying server implementations of version 1.3 of the TLS cryptographic protocol. TLS is a widely used cryptographic protocol designed to create secure data-transmission channels and provides the necessary functionality for this: confidentiality of the transmitted data, data integrity, and authentication of the parties. Version 1.3 of the TLS protocol was introduced in August 2018 and differs significantly from the previous version 1.2; a number of TLS developers have already included support for the latest version in their implementations. These circumstances make research into the verification and security of new TLS implementations highly relevant. We used a new test suite for verifying TLS 1.3 implementations for compliance with the Internet specifications, developed on the basis of RFC 8446 using UniTESK technology and mutation-testing methods. The current work is part of the TLS 1.3 protocol verification project and covers some of the additional functionality and optional protocol extensions. To test implementations for compliance with formal specifications, UniTESK technology is used, which provides test-automation tools based on finite state machines. The states of the system under test define the states of the state machine, and the test stimuli are the transitions of this machine. When a transition is performed, the specified stimulus is passed to the implementation under test, the implementation's reactions are recorded, and a verdict on the compliance of the observed behavior with the specification is issued automatically. Mutation-testing methods are used to detect non-standard behavior of the system under test by transmitting incorrect data.
Certain changes are made to the protocol exchange flow created in accordance with the specification: either the values of message fields formed on the basis of the developed protocol model are changed, or the order of messages in the exchange flow is changed. The protocol model allows changes to the data flow at any stage of the network exchange, so the test scenario can pass through all significant states of the protocol and, in each such state, exercise the implementation according to the specified program. So far, several implementations have been found to deviate from the specification. The presented approach has proven effective in several of our projects on testing network protocols, detecting various deviations from the specification and other errors.
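The field-mutation step can be illustrated with a toy sketch: a much-simplified ClientHello-like record, a helper that produces mutants of one field, and a stand-in "implementation under test". All names and the acceptance rule here are assumptions for illustration, not the paper's actual test suite:

```python
def mutate_field(message, field, bad_values):
    """Return copies of a protocol message, each with one field
    replaced by a deliberately invalid value (the mutation step)."""
    mutants = []
    for value in bad_values:
        mutant = dict(message)
        mutant[field] = value
        mutants.append(mutant)
    return mutants

# Hypothetical, heavily simplified ClientHello-like record.
client_hello = {"legacy_version": 0x0303, "cipher_suites": [0x1301]}

def toy_server_accepts(msg):
    """Toy stand-in for the implementation under test: it must reject
    records whose legacy_version is not 0x0303 or whose cipher-suite
    list is empty (RFC 8446 fixes legacy_version at 0x0303)."""
    return msg["legacy_version"] == 0x0303 and bool(msg["cipher_suites"])
```

A verdict is then issued by checking that every mutant is rejected while the well-formed message is accepted; in the real suite this check is driven by the UniTESK state machine at each protocol state.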

Keywords: security, TLS, TLSv1.3, protocols, testing, verification, robustness evaluation, Internet, standards, formal specifications.

Research of Data Processing, Detection and Protection Algorithms to Minimize the Impact of Malware and Phishing Attacks on Users of Digital Platforms

Tatiana Sergeevna Volokitina, Maxim Olegovich Tanygin
187-206
Abstract:

The article is devoted to the development of a scientific and methodological apparatus for improving the effectiveness of protecting digital platforms from cyber threats by creating processing and detection algorithms that take into account the cognitive characteristics of users. A conceptual model of a three-stage protection system is proposed, integrating technical security mechanisms with cognitive decision-making models. A heuristic detection algorithm based on Random Forest machine learning with analysis of 47 features, including technical URL characteristics and cognitive-semantic content characteristics, has been developed. A methodology for dynamic integration of four threat data sources has been created, reducing response time from 12–14 hours to two hours. An algorithm for recursive analysis of redirection chains up to ten levels deep to detect masked threats is proposed. Experimental validation on an empirical base of approximately one million records confirmed detection accuracy of 87% when processing one hundred thousand records per hour. The developed solutions ensure compliance with the requirements of GOST R 57580.1-2017 and Russian legislation in the field of personal data protection.
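Of the algorithms listed, the recursive redirect-chain analysis is the easiest to sketch. The resolver interface below is an assumption (a real system would issue HTTP requests and inspect Location headers), but the ten-level depth limit and loop detection follow the abstract:

```python
def follow_redirects(url, resolve, max_depth=10):
    """Walk a redirect chain up to max_depth levels.

    `resolve` maps a URL to its redirect target, or None when the URL
    is a final destination. Returns the visited chain and a verdict:
    'final', 'loop' (masking via redirect cycles), or 'too_deep'.
    """
    chain = [url]
    seen = {url}
    while len(chain) <= max_depth:
        nxt = resolve(chain[-1])
        if nxt is None:
            return chain, "final"
        if nxt in seen:
            return chain, "loop"
        chain.append(nxt)
        seen.add(nxt)
    return chain, "too_deep"
```

Chains that end in "loop" or "too_deep" are suspicious on their own; in the described system the intermediate URLs would additionally be fed to the feature-based detector.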

Keywords: heuristic threat detection, machine learning, cognitive security, phishing attacks, social engineering, data protection, threat source integration.

Information and Analytical Systems in the Fight Against Corruption

Nikolai Ivanovich Bayandin
63-71
Abstract:

The article discusses the use of new information technologies in the fight against corruption. The human factor is explained as a key element of anti-corruption efforts in business. Information on professional development programs in the anti-corruption field is provided.

Keywords: corruption, anti-corruption efforts, human factor, information technologies, master's educational programs.

Tula Online Tool for Balancing Video Games

Valeria Rashidovna Rakhmankulova, Vlada Vladimirovna Kugurakova
903-930
Abstract:

This paper presents the development of Tula, a tool for video game balancing. The necessity for such a tool is substantiated by the growing requirements for quality and cost-effectiveness in the video game industry, particularly in managing in-game economy and game world logic. The study analyzes existing tools and approaches to game balancing, identifying their limitations, which informed the design of the new tool's functionality. The presented tool integrates features of contemporary solutions while providing enhanced capabilities for game parameter analysis and testing, including prototype generation via class descriptions and real-time simulation. The technological foundation and architecture of the tool are described in detail. Key implementation aspects are discussed: interface responsiveness, continuous data synchronization, and security. Comparative analysis with Machinations revealed advantages in data processing correctness, interface convenience, and prototype modification flexibility.

Keywords: video games, gameplay, game mechanics, game balance, game design, Machinations.

Procedure for Comparing Text Recognition Software Solutions For Scientific Publications by the Quality of Metadata Extraction

Ilia Igorevich Kuznetsov, Oleg Panteleevich Novikov, Dmitry Yurievich Ilin
654-680
Abstract:

Metadata of scientific publications are used to build catalogs, determine citation counts, and perform other tasks. Automating metadata extraction from PDF files speeds up these tasks, while the usability of the obtained data depends on the quality of extraction. Existing software solutions were analyzed, and three were selected: GROBID, CERMINE, and ScientificPdfParser. A procedure for comparing software solutions for recognizing the texts of scientific publications by the quality of metadata extraction is proposed. Based on this procedure, an experiment was conducted to extract four types of metadata (title, abstract, publication date, author names). To compare the solutions, a dataset of 112,457 publications divided into 23 subject areas, formed on the basis of Semantic Scholar data, was used. An example of choosing an effective software solution for metadata extraction under specified priorities for subject areas and metadata types using a weighted sum is given. For this example, CERMINE showed efficiency 10.5% higher than GROBID and 9.6% higher than ScientificPdfParser.
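The weighted-sum selection step can be illustrated directly. The weights and quality values below are invented for the example and are not the paper's measured numbers:

```python
def weighted_score(quality, weights):
    """Combine per-metadata-type extraction quality into a single
    score via a weighted sum over the prioritized metadata types."""
    return sum(weights[key] * quality[key] for key in weights)

# Hypothetical priorities: titles matter most, dates least.
weights = {"title": 0.4, "abstract": 0.3, "date": 0.1, "authors": 0.2}

# Hypothetical per-type extraction quality for two tools
# (fraction of correctly extracted fields).
tool_a = {"title": 0.9, "abstract": 0.7, "date": 0.95, "authors": 0.8}
tool_b = {"title": 0.85, "abstract": 0.8, "date": 0.9, "authors": 0.85}
```

With these numbers tool_b scores 0.84 against 0.825 for tool_a, so it would be chosen; changing the priorities can flip the ranking, which is why the paper conditions its conclusion on the specified priorities.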

Keywords: text recognition, scientific publications, metadata, data extraction quality, procedure.

© 2015-2026 Kazan Federal University; Institute of the Information Society