Development of a Visual Perception System for Game Agents in Video Games
Abstract
An algorithm for a visual perception system for game agents, implemented in the Unity game engine, is presented. The proposed method is based on comparing images from two cameras while taking complex visual effects (lighting, shadows, camouflage) into account, and is supplemented with a line-of-sight check that allows for the speed of the observed object and with a mechanic of gradual detection. Testing showed a significant gain in detection realism compared with traditional methods, while the additional CPU load remained small. The algorithm was optimized using the Unity Job System and dynamic camera activation. The scientific literature on similar solutions was also analyzed, and the strengths and weaknesses of those solutions were identified. The results can be applied in video game development to create realistic behavior of non-player characters, especially in games with stealth elements.
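To make the detection mechanics summarized above more concrete, the sketch below shows how a line-of-sight check combined with gradual detection could be expressed as a Unity C# component. This is an illustrative assumption, not the published implementation: the class and field names (AgentVision, detectionGain, detectionDecay) are hypothetical, and the two-camera image comparison and speed weighting described in the article are reduced here to a single visibility test for brevity.

```csharp
using UnityEngine;

// Hypothetical illustration of gradual detection driven by a line-of-sight check.
// Names and thresholds are assumptions, not the article's published implementation.
public class AgentVision : MonoBehaviour
{
    public Transform target;              // the player or another observed object
    public float viewAngle = 110f;        // horizontal field of view, degrees
    public float viewDistance = 25f;      // maximum sight distance, metres
    public float detectionGain = 0.5f;    // detection accrued per second in full view
    public float detectionDecay = 0.25f;  // detection lost per second when hidden
    public LayerMask occluders;           // geometry that can block the line of sight

    float detection;                      // 0 = unaware, 1 = target fully detected

    void Update()
    {
        bool visible = HasLineOfSight();

        // Gradual detection: the value rises while the target is seen and
        // slowly falls back when sight is lost, instead of toggling instantly.
        float rate = visible ? detectionGain : -detectionDecay;
        detection = Mathf.Clamp01(detection + rate * Time.deltaTime);

        if (detection >= 1f)
            Debug.Log($"{name} has fully detected {target.name}");
    }

    bool HasLineOfSight()
    {
        Vector3 toTarget = target.position - transform.position;
        if (toTarget.magnitude > viewDistance) return false;
        if (Vector3.Angle(transform.forward, toTarget) > viewAngle * 0.5f) return false;

        // Blocked if any occluder lies between the agent's eyes and the target.
        return !Physics.Raycast(transform.position, toTarget.normalized,
                                toTarget.magnitude, occluders);
    }
}
```

Accumulating a detection value over time, rather than flipping an instant boolean, is what produces the gradual detection behavior referred to in the abstract: a target glimpsed briefly at the edge of vision raises suspicion without immediately triggering a full alert.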

This work is licensed under a Creative Commons Attribution 4.0 International License.