Software Framework for Implementing User Interface Interaction in iOS Applications Based on Oculography
Abstract
The use of gaze-tracking technologies for user interface interaction in iOS applications is significantly hampered by the absence of a unified approach to their integration. Existing solutions are either strictly limited to their own use case or built solely for research purposes and thus inapplicable to real-world problems. This article focuses on the development of a software framework that performs gaze tracking using native technologies and proposes a unified approach to the development of gaze-driven iOS applications.
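As an illustration only (a sketch of the underlying native technology, not the framework's actual code), gaze estimation on iOS can be built on ARKit's TrueDepth-based face tracking, whose per-frame ARFaceAnchor exposes a lookAtPoint gaze estimate (cf. [22], [24]). The GazeTrackingSession type and its onGazeUpdate callback below are hypothetical names:

```swift
import ARKit

// Hypothetical sketch: runs an ARKit face-tracking session and reports the
// face anchor's lookAtPoint, i.e. the estimated gaze target in face-anchor
// space. Illustrative only; not the article's framework.
final class GazeTrackingSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    // Hypothetical callback; a real framework would project this point into
    // screen coordinates and expose it through higher-level interaction APIs.
    var onGazeUpdate: ((simd_float3) -> Void)?

    func start() {
        // Face tracking requires a device with a TrueDepth camera (cf. [6], [7]).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first,
              face.isTracked else { return }
        onGazeUpdate?(face.lookAtPoint)
    }
}
```

From gaze estimates of this kind, blend-shape coefficients [24] and the standard UIGestureRecognizer initializer [25] provide natural hooks for turning gaze into interface events; the repository accompanying the article [28] describes itself as exactly such a UIGestureRecognizer extension.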
References
2. Roig-Maimó M.F. et al. Evaluation of a mobile head-tracker interface for accessibility // International Conference on Computers Helping People with Special Needs. Springer, Cham, 2016. P. 449–456. https://doi.org/10.1007/978-3-319-41267-2_63
3. Abbaszadegan M., Yaghoubi S., MacKenzie I.S. TrackMaze: A comparison of head-tracking, eye-tracking, and tilt as input methods for mobile games // International Conference on Human-Computer Interaction. Springer, Cham, 2018. P. 393–405. https://doi.org/10.1007/978-3-319-91250-9_31
4. Tupikovskaja-Omovie Z., Tyler D. Clustering consumers' shopping journeys: eye tracking fashion m-retail // Journal of Fashion Marketing and Management: An International Journal. 2020. Vol. 24. No. 3. P. 381–398. https://doi.org/10.1108/JFMM-09-2019-0195
5. Garbutt M. et al. The embodied gaze: Exploring applications for mobile eye tracking in the art museum // Visitor Studies. 2020. Vol. 23. No. 1. P. 82–100. https://doi.org/10.1080/10645578.2020.1750271
6. Vogt M., Rips A., Emmelmann C. Comparison of iPad Pro®’s LiDAR and TrueDepth capabilities with an industrial 3D scanning solution // Technologies. 2021. Vol. 9. No. 2. P. 25. https://doi.org/10.3390/technologies9020025
7. Breitbarth A. et al. Measurement accuracy and dependence on external influences of the iPhone X TrueDepth sensor // Photonics and Education in Measurement Science 2019. International Society for Optics and Photonics, 2019. Vol. 11144. P. 1114407. https://doi.org/10.1117/12.2530544
8. Number of smartphone users worldwide from 2016 to 2023 // Statista – The Statistics Portal for Market data, Market Research and Market Studies. URL: https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/
9. Krafka K. et al. Eye tracking for everyone // Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016. P. 2176–2184. https://doi.org/10.1109/CVPR.2016.239
10. Huang M.X. et al. ScreenGlint: Practical, in-situ gaze estimation on smartphones // Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. 2017. P. 2546–2557. https://doi.org/10.1145/3025453.3025794
11. Brousseau B., Rose J., Eizenman M. SmartEye: An accurate infrared eye tracking system for smartphones // 2018 9th IEEE Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON). IEEE, 2018. P. 951–959. https://doi.org/10.1109/UEMCON.2018.8796799
12. Hawkeye Access | Control your iOS device using your eyes. URL: https://www.usehawkeye.com/accessibility
13. Kong A. et al. EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices // Proceedings of the 2021 International Conference on Multimodal Interaction. 2021. P. 577–585. https://doi.org/10.1145/3462244.3479938
14. Skyle 2 for iPad – eyeV. URL: https://eyev.de/en/ipad/
15. Cicek M. et al. Mobile head tracking for ecommerce and beyond // Electronic Imaging. 2020. Vol. 2020. No. 3. P. 303-1–303-12. https://doi.org/10.48550/arXiv.1812.07143
16. Kaufman A.E., Bandopadhay A., Shaviv B.D. An eye tracking computer user interface // Proceedings of 1993 IEEE Research Properties in Virtual Reality Symposium. IEEE, 1993. P. 120–121. https://doi.org/10.1109/VRAIS.1993.378254
17. Gibaldi A. et al. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research // Behavior Research Methods. 2017. Vol. 49. No. 3. P. 923–946. https://doi.org/10.3758/s13428-016-0762-9
18. Xu P. et al. TurkerGaze: Crowdsourcing saliency with webcam based eye tracking // arXiv preprint arXiv:1504.06755. 2015. https://doi.org/10.48550/arXiv.1504.06755
19. Qiao X. et al. A new era for web AR with mobile edge computing // IEEE Internet Computing. 2018. Vol. 22. No. 4. P. 46–55. https://doi.org/10.1109/MIC.2018.043051464
20. Taaban R.A., Croock M.S., Korial A.E. Eye Tracking Based Mobile Application // International Journal of Advanced Research in Computer Engineering & Technology (IJARCET). 2018. Vol. 7. No. 3. P. 246–250.
21. Heryadi Y. et al. Mata: An Android Eye-Tracking Based User Interface Control Application // Journal of Games, Game Art, and Gamification. 2016. Vol. 1. No. 1. P. 35–40. https://doi.org/10.21512/jggag.v1i1.7249
22. Greinacher R., Voigt-Antons J.N. Accuracy Assessment of ARKit 2 Based Gaze Estimation // International Conference on Human-Computer Interaction. Springer, Cham, 2020. P. 439–449. https://doi.org/10.1007/978-3-030-49059-1_32
23. devicekit/DeviceKit: DeviceKit is a value-type replacement of UIDevice. URL: https://github.com/devicekit/DeviceKit
24. blendShapes | Apple Developer Documentation. URL: https://developer.apple.com/documentation/arkit/arfaceanchor/2928251-blendshapes
25. init(target:action:) | Apple Developer Documentation. URL: https://developer.apple.com/documentation/uikit/uigesturerecognizer/1624211-init
26. Swift.org – Package Manager. URL: https://www.swift.org/package-manager/
27. Optimizing ProMotion Refresh Rates for iPhone 13 Pro and iPad Pro | Apple Developer Documentation. URL: https://developer.apple.com/library/archive/technotes/tn2460/_index.html
28. GitHub – ReQEnoxus/gaze-tracker: UIGestureRecognizer extension based on GazeTracking. URL: https://github.com/ReQEnoxus/gaze-tracker
This work is licensed under a Creative Commons Attribution 4.0 International License.
By submitting an article for publication in the Russian Digital Libraries Journal (RDLJ), the authors automatically consent to grant Kazan (Volga) Federal University (KFU) a limited license to use the article (provided, of course, that it is accepted for publication). This means that KFU has the right to publish the article in the next issue of the journal (on the website or in printed form), to reprint it in the RDLJ archives on CD, or to include it in a particular information system or database produced by KFU.
All copyrighted materials are placed in RDLJ with the consent of the authors. If any author objects to the publication of their materials on this site, the materials can be removed upon written notification to the Editor.
Documents published in RDLJ are protected by copyright, and all rights are reserved by the authors. Authors independently monitor compliance with their rights to reproduce or translate their papers published in the journal. If material published in RDLJ is reprinted with permission by another publisher or translated into another language, a reference to the original publication must be given.
By submitting an article for publication in RDLJ, authors should take into account that publications on the Internet, on the one hand, provide unique opportunities for access to their content, but, on the other hand, represent a new form of information exchange in the global information society, in which authors and publishers are not always protected against unauthorized copying or other use of copyrighted materials.
RDLJ is copyrighted. When using materials from the journal, the URL index.phtml?page=elbib/rus/journal must be indicated. No changes, additions, or edits to the author's text are allowed. Copying individual fragments of articles from the journal is permitted in order to distribute, remix, adapt, and build upon the article, even commercially, as long as the original article is credited.
Requests for the right to reproduce or use any of the materials published in RDLJ should be addressed to the Editor-in-Chief A.M. Elizarov at the following address: amelizarov@gmail.com.
The publisher of RDLJ is not responsible for the views expressed in the published articles.
We suggest that the authors of articles download the copyright agreement on the transfer of non-exclusive rights to use the work from this page, sign it, and send a scanned copy to the journal publisher by e-mail.