Generation of Python Syntax from Turkish Verbal Expressions with Deep Learning Approaches
Abstract
In this study, we aim to automatically generate Python syntax from Turkish verbal expressions. The work was carried out primarily to make learning Python easier: it is intended to offer considerable convenience to teachers who give coding lessons in educational settings, and at the same time to let their students learn coding quickly and easily. A new approach is proposed to achieve this aim. Sentences spoken in Turkish are first transcribed into text using the "speech recognition" library in Python. The Turkish text is then translated into English with deep learning models, namely Long Short-Term Memory (LSTM), Bidirectional LSTM (BiLSTM), and Gated Recurrent Unit (GRU) networks. Machine translation is performed with an Encoder-Decoder model based on the sequence-to-sequence (Seq2seq) architecture, which takes Turkish sentences as input and produces English sentences as output. The English sentences are then given as input to the Generative Pre-trained Transformer-3 (GPT-3) model, a pre-trained language model developed by OpenAI, which generates the corresponding Python code. To the best of our knowledge, this is the first study to produce Python code from Turkish verbal expressions: although machine translation from one language to another is well established, no previous work generates the Python code corresponding to a Turkish verbal expression, and this constitutes the innovative side of our work. So that end users can use the system, both a web-based application and a Windows-based application were developed with the "Django" and "PyQt5" libraries. The Bilingual Evaluation Understudy (BLEU) metric was used to evaluate the results; in these calculations, the BiLSTM model produced higher scores than the other models.
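As a rough illustration of the transcription step, the sketch below converts Turkish speech to text with the "speech recognition" library. The abstract names the library but not the recognizer backend or its settings, so the Google Web Speech backend and the "tr-TR" language code used here are assumptions.

import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    # Reduce background noise, then capture one spoken sentence
    recognizer.adjust_for_ambient_noise(source)
    audio = recognizer.listen(source)

# Assumption: the Google Web Speech backend with Turkish ("tr-TR");
# the study does not state which backend or settings were used.
turkish_text = recognizer.recognize_google(audio, language="tr-TR")
print(turkish_text)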
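The translation step relies on an Encoder-Decoder model built on the Seq2seq architecture. Below is a minimal Keras sketch of the LSTM variant (the BiLSTM and GRU variants swap out the recurrent layer); the framework, vocabulary sizes, and layer dimensions are assumptions, since the abstract does not report them.

import tensorflow as tf
from tensorflow.keras import layers, Model

src_vocab, tgt_vocab, units = 10000, 10000, 256  # assumed sizes

# Encoder: embeds the Turkish sentence and keeps its final states
enc_in = layers.Input(shape=(None,))
enc_emb = layers.Embedding(src_vocab, units)(enc_in)
_, state_h, state_c = layers.LSTM(units, return_state=True)(enc_emb)

# Decoder: generates the English sentence, initialized with the encoder states
dec_in = layers.Input(shape=(None,))
dec_emb = layers.Embedding(tgt_vocab, units)(dec_in)
dec_out, _, _ = layers.LSTM(units, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
probs = layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")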
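Code generation then prompts GPT-3 with the translated English sentence. The sketch below uses the legacy (pre-1.0) openai Python package and its Completion endpoint; the model name, prompt wording, and decoding parameters are assumptions, as the abstract only states that GPT-3 was used.

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

english_sentence = "print the numbers from 1 to 10"

# Assumption: a GPT-3 completion model and a simple instruction prompt;
# the study does not report the exact model or prompt template.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt=f"Write Python code that does the following:\n{english_sentence}\nPython code:",
    max_tokens=128,
    temperature=0,
)
print(response.choices[0].text.strip())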
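Finally, translation quality is scored with the BLEU metric. A minimal sentence-level sketch with NLTK follows; the abstract does not say which BLEU implementation was used, so NLTK and the smoothing choice are assumptions.

from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["print", "the", "numbers", "from", "1", "to", "10"]]
candidate = ["print", "numbers", "from", "1", "to", "10"]

# Smoothing avoids zero scores when a higher n-gram order has no matches
smooth = SmoothingFunction().method1
score = sentence_bleu(reference, candidate, smoothing_function=smooth)
print(f"BLEU = {score:.3f}")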

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.