Neuro-Symbolic Approach to Augmented Text Generation via Automated Induction of Morphotactic Rules


Abstract

The paper presents a hybrid neuro-symbolic method that combines a large language model (LLM) with a finite-state transducer (FST) to ensure morphological correctness in text generation for agglutinative languages. The system induces rules automatically from corpus data: for local examples of word forms in context, the LLM produces sequences of morphological analyses, which are then aggregated and organized into compact descriptions of morphotactics (in LEXC format) and allomorph selection (as context-sensitive regex rewrite rules). During generation, the LLM and the FST operate jointly: if a token is not recognized by the transducer, the LLM derives a “lemma+tags” pair from the context, and the FST generates the correct surface form. A literary corpus of about 1,600 sentences served as the dataset; for a list of 50 nouns, 250 word forms were extracted. Using the proposed algorithm, the LLM produced 110 context-sensitive regex rules together with the LEXC morphotactics, from which an FST was compiled that recognized 170 of the 250 forms (~70%). In an applied machine-translation test on a 300-sentence subcorpus, integrating this FST into the LLM loop improved quality from BLEU 16.14 / ChrF 45.13 to BLEU 25.71 / ChrF 50.87 without retraining the translator. The approach extends to other parts of speech (verbs, adjectives, etc.) and to other agglutinative and low-resource languages, where it can accelerate the development of lexical and grammatical resources.
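
To make the generate-then-correct loop concrete, the sketch below shows a minimal Python rendering of the idea: a generated token that the transducer does not recognize is re-analyzed by the LLM into a lemma and tag string, and the FST produces the correct surface form. All names here (the ToyFST stand-in, the llm_lemma_and_tags stub, the LEXC fragment, and the Tatar-style word forms) are illustrative assumptions, not the authors' implementation; in the described system the FST is compiled from the LLM-induced LEXC morphotactics and regex allomorph-selection rules.

```python
# Minimal sketch of the joint LLM + FST correction loop from the abstract.
# ToyFST, llm_lemma_and_tags and the example word forms are illustrative
# assumptions; the real system compiles an FST from LLM-induced LEXC
# morphotactics and context-sensitive regex (allomorph-selection) rules.

from typing import Optional

# Hypothetical fragment of the kind of LEXC morphotactics the LLM induces:
# a noun stem feeds an optional plural slot, then an optional case slot.
# The да ~ та allomorph choice is left to the induced regex rewrite rules.
EXAMPLE_LEXC = """
LEXICON Root
Nouns ;

LEXICON Nouns
китап Plural ;      ! "book"

LEXICON Plural
+Pl:лар Case ;
Case ;              ! singular skips the plural slot

LEXICON Case
+Loc:да # ;
# ;
"""


class ToyFST:
    """Stand-in for the compiled transducer: recognizes and generates forms."""

    def __init__(self, forms: dict[tuple[str, str], str]):
        self._generate = dict(forms)            # (lemma, tags) -> surface
        self._surfaces = set(forms.values())    # recognized surface forms

    def recognizes(self, surface: str) -> bool:
        return surface in self._surfaces

    def generate(self, lemma: str, tags: str) -> Optional[str]:
        return self._generate.get((lemma, tags))


def llm_lemma_and_tags(token: str, context: str) -> tuple[str, str]:
    """Hypothetical LLM call: infer a "lemma+tags" analysis from context.
    Hard-coded here; in the described system an LLM produces it."""
    return "китап", "+Pl+Loc"


def postprocess(tokens: list[str], context: str, fst: ToyFST) -> list[str]:
    """If the FST does not recognize a generated token, ask the LLM for
    lemma+tags and let the FST produce the correct surface form."""
    fixed = []
    for token in tokens:
        if fst.recognizes(token):
            fixed.append(token)
            continue
        lemma, tags = llm_lemma_and_tags(token, context)
        surface = fst.generate(lemma, tags)
        fixed.append(surface if surface is not None else token)  # keep raw token on failure
    return fixed


if __name__ == "__main__":
    fst = ToyFST({
        ("китап", ""): "китап",
        ("китап", "+Pl"): "китаплар",
        ("китап", "+Loc"): "китапта",         # devoiced allomorph after п
        ("китап", "+Pl+Loc"): "китапларда",
    })
    # Suppose the generator produced an ill-formed plural locative:
    print(postprocess(["китапларде"], "…", fst))   # -> ['китапларда']
```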

Article Details

How to Cite
Isangulov, M. V., A. M. Elizarov, A. R. Kunafin, A. R. Gatiatullin, and N. A. Prokopyev. “Neuro-Symbolic Approach to Augmented Text Generation via Automated Induction of Morphotactic Rules”. Russian Digital Libraries Journal, vol. 28, no. 5, Dec. 2025, pp. 1085-1102, doi:10.26907/1562-5419-2025-28-5-1085-1102.
