Spatial Orientation Of Objects Based On Processing Of A Natural Language Text For Storyboard Generation
Abstract
The article presents our approaches to processing natural language text in order to extract the specific spatial relationships between objects and to produce three-dimensional frame-by-frame visualization. The proposed approach shows how explicit constraints derived from the extracted spatial relationships restrict the scene and makes it possible to generate candidate layouts of objects. Interpreting spatial knowledge expressed in natural language allows three-dimensional scenes to be generated, which in turn is necessary for translating the scriptwriter's intent into video game design. The work also takes into account directing rules for composing successful shots, among them the choice of shot size, camera rotation, and compositional nuances.
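The pipeline the abstract describes begins by pulling spatial relations out of the text so they can later act as placement constraints on a 3D scene. The following is a toy sketch of that extraction step only; the preposition list, stopword list, and nearest-content-word heuristic are illustrative assumptions, not the authors' system, which would rely on a full dependency parser such as spaCy.

```python
# Toy extraction of (figure, preposition, ground) triples from a sentence.
# Each triple can later be read as a placement constraint, e.g.
# ("lamp", "on", "table") -> position the lamp on top of the table.

SPATIAL_PREPS = {"on", "under", "near", "behind", "in", "above", "beside"}
STOPWORDS = {"the", "a", "an", "is", "are", "was", "were"}


def extract_relations(sentence: str) -> list:
    """Return (figure, preposition, ground) triples from a simple sentence."""
    tokens = [t.strip(".,;").lower() for t in sentence.split()]
    triples = []
    for i, tok in enumerate(tokens):
        if tok not in SPATIAL_PREPS:
            continue
        # Figure: nearest content word to the left of the preposition.
        figure = next((t for t in reversed(tokens[:i])
                       if t not in STOPWORDS and t not in SPATIAL_PREPS), None)
        # Ground: nearest content word to the right of the preposition.
        ground = next((t for t in tokens[i + 1:]
                       if t not in STOPWORDS and t not in SPATIAL_PREPS), None)
        if figure and ground:
            triples.append((figure, tok, ground))
    return triples


print(extract_relations("The lamp is on the table near the window."))
```

A real implementation would resolve attachment ambiguity (e.g. whether "near the window" modifies the table or the lamp) from the parse tree rather than from token adjacency.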
By submitting an article for publication in the Russian Digital Libraries Journal (RDLJ), the authors automatically consent to grant Kazan (Volga Region) Federal University (KFU) a limited license to use the materials, provided the article is accepted for publication. This means that KFU has the right to publish the article in the next issue of the journal (on the website or in printed form), to reprint it in the RDLJ CD archives, and to include it in any information system or database produced by KFU.
All copyrighted materials are published in RDLJ with the consent of the authors. If any author objects to the publication of their materials on this site, the materials can be removed upon written notification to the Editor.
Documents published in RDLJ are protected by copyright, and all rights are reserved by the authors. Authors independently monitor compliance with their rights to reproduce or translate their papers published in the journal. If material published in RDLJ is reprinted with permission by another publisher or translated into another language, a reference to the original publication must be given.
By submitting an article for publication in RDLJ, authors should take into account that online publication, on the one hand, provides unique opportunities for access to their content but, on the other hand, represents a new form of information exchange in the global information society, in which authors and publishers are not always protected against unauthorized copying or other use of copyrighted materials.
RDLJ content is copyrighted. When using materials from the journal, the URL must be indicated: index.phtml page = elbib / rus / journal?. No changes, additions, or edits to the author's text are allowed. Copying individual fragments of articles from the journal is permitted: readers may distribute, remix, adapt, and build upon an article, even commercially, as long as they credit the original publication.
Requests for the right to reproduce or use any of the materials published in RDLJ should be addressed to the Editor-in-Chief A.M. Elizarov at the following address: amelizarov@gmail.com.
The publishers of RDLJ are not responsible for the views set out in published articles.
We suggest that authors download the copyright agreement on the transfer of non-exclusive rights to use the work from this page, sign it, and send a scanned copy to the publisher's e-mail address.