Open Access

Unleashing the potential: harnessing generative artificial intelligence for empowering model training

22 July 2024
About this article


Anagnoste, S. (2018). Robotic Automation Process - The operating system for the digital enterprise. Proceedings of the International Conference on Business Excellence, 12(1), 54-69. https://doi.org/10.2478/picbe-2018-0007

Anagnoste, S. (2024, March). Today’s GenAI capabilities have been developing since 2009. Bucharest: OnStrategy.

Arcila, B. B. (2023). Is it a Platform? Is it a Search Engine? It’s Chat GPT! The European Liability Regime for Large Language Models. Journal of Free Speech Law, 3, 455.

Bezko, G. (2023, December 7). Understanding AI, ML & Co. in Contact Centers: Definitions and Explanations. Retrieved from MiaRec: https://blog.miarec.com/contact-centers-ai-definition

Boom, C. D., Canneyt, S. V., Demeester, T., & Dhoedt, B. (2016). Representation learning for very short texts using weighted word embedding aggregation. Pattern Recognition Letters, 150–156.

Cai, Y., Mao, S., Wu, W., Wang, Z., Liang, Y., Ge, T., . . . Duan, N. (2023). Low-code LLM: Visual Programming over LLMs. arXiv preprint arXiv:2304.08103. https://doi.org/10.48550/arXiv.2304.08103

Calzone, O. (2022, February 21). An Intuitive Explanation of LSTM. Retrieved from Medium: https://medium.com/@ottaviocalzone/an-intuitive-explanation-of-lstm-a035eb6ab42c

Caprasi, C. (2023, July 21). Artificial Intelligence, Machine Learning, Deep Learning, GenAI and more. Retrieved from Medium - Women in Technology: https://medium.com/womenintechnology/ai-c3412c5aa0ac

Chui, M., Hazan, E., Roberts, R., Singla, A., Smaje, K., Sukharevsky, A., . . . Zemmel, R. (2023). The economic potential of generative AI: The next productivity frontier. McKinsey & Company, McKinsey Digital. Retrieved from https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier#introduction

Data Base Camp. (2022, June 4). Long Short-Term Memory Networks (LSTM) - simply explained! Retrieved from Data Base Camp: https://databasecamp.de/en/ml/lstms

Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv preprint arXiv:1810.04805. https://doi.org/10.48550/arXiv.1810.04805

Fatima, N., Imran, A. S., Kastrati, Z., Daudpota, S. M., & Soomro, A. (2022). A Systematic Literature Review on Text Generation Using Deep Neural Network Models. IEEE Access, 10.

Feng, S. Y., Gangal, V., Wei, J., Chandar, S., Vosoughi, S., Mitamura, T., & Hovy, E. (2021). A Survey of Data Augmentation Approaches for NLP. Findings of the Association for Computational Linguistics: ACL-IJCNLP, 968–988.

Gartner. (2023, July 31). Gartner Says Conversational AI Capabilities Will Help Drive Worldwide Contact Center Market to 16% Growth in 2023. Retrieved from Gartner: https://www.gartner.com/en/newsroom/press-releases/2023-07-31-gartner-says-conversational-ai-capabilities-will-help-drive-worldwide-contact-center-market-to-16-percent-growth-in-2023

Gartner. (2023). Generative AI. Retrieved from Gartner: https://www.gartner.com/en/information-technology/glossary/generative-ai

Gruetzemacher, R. (2022). The Power of Natural Language Processing. Harvard Business Review. Retrieved from https://hbr.org/2022/04/the-power-of-natural-language-processing

Hochreiter, S., & Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, 9(8), 1735-1780. https://doi.org/10.1162/neco.1997.9.8.1735

IBM. (2023, July 6). AI vs. Machine Learning vs. Deep Learning vs. Neural Networks: What’s the difference? Retrieved from IBM: https://www.ibm.com/blog/ai-vs-machine-learning-vs-deep-learning-vs-neural-networks/

Jain, R., Gervasoni, N., Ndhlovu, M., & Rawat, S. (2023). A Code Centric Evaluation of C/C++ Vulnerability Datasets for Deep Learning Based Vulnerability Detection Techniques. Proceedings of the 16th Innovations in Software Engineering Conference, 1–10.

Kang, H., Wu, H., & Zhang, X. (2020). Generative Text Steganography Based on LSTM Network and Attention Mechanism with Keywords. Electronic Imaging.

Kumar, T. S. (2022, August 26). Natural Language Processing – Sentiment Analysis using LSTM. Retrieved from Analytics Vidhya: https://www.analyticsvidhya.com/blog/2021/06/natural-language-processing-sentiment-analysis-using-lstm/

Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., . . . Zettlemoyer, L. (2019). BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension. arXiv preprint arXiv:1910.13461.

Li, X., Zhu, X., Ma, Z., Liu, X., & Shah, S. (2023). Are ChatGPT and GPT-4 General-Purpose Solvers for Financial Text Analytics? An Examination on Several Typical Tasks. arXiv preprint arXiv:2305.05862. https://doi.org/10.48550/arXiv.2305.05862

Lindemann, B., Müller, T., Vietz, H., Jazdi, N., & Weyrich, M. (2021). A survey on long short-term memory networks for time series prediction. Procedia CIRP, 99, 650-655.

Liu, Y., & Lapata, M. (2018). Learning Structured Text Representations. Transactions of the Association for Computational Linguistics, 6, 63-75. https://doi.org/10.48550/arXiv.1705.09207

Luaran, N., & Alfred, R. (2022). Assessment of the Optimization of Hyperparameters in Deep LSTM for Time Series Sea Water Tidal Shift. Research Square.

Medium. (2019). Recurrent Neural Network and Long Term Dependencies. Retrieved from Medium: https://infolksgroup.medium.com/recurrent-neural-network-and-long-term-dependencies-e21773defd92

Mungalpara, J. (2022, July 26). Stemming, Lemmatization, Stopwords and N-Grams in NLP. Retrieved from Medium: https://jaimin-ml2001.medium.com/stemming-lemmatizationstopwords-and-n-grams-in-nlp-96f8e8b6aa6f

Naveed, H., Khan, A. U., Qiu, S., Saqib, M., Anwar, S., Usman, M., . . . Mian, A. (2024). A Comprehensive Overview of Large Language Models. arXiv preprint arXiv:2307.06435. https://doi.org/10.48550/arXiv.2307.06435

Noaman, H. M., Sarhan, S. S., & Rashwan, M. A. (2018). Enhancing recurrent neural network-based language models by word tokenization. Human-centric Computing and Information Sciences, 8(12). https://doi.org/10.1186/s13673-018-0133-x

Novelli, C., Casolari, F., Rotolo, A., Taddeo, M., & Floridi, L. (2023). Taking AI Risks Seriously: a Proposal for the AI Act. AI & SOCIETY, 1-5.

Olah, C. (2015). Understanding LSTM Networks. Retrieved from colah’s blog: https://colah.github.io/posts/2015-08-Understanding-LSTMs/

Peters, M. E., Neumann, M., Iyyer, M., Gardner, M., Clark, C., Lee, K., & Zettlemoyer, L. (2018, June). Deep Contextualized Word Representations. (M. Walker, H. Ji, & A. Stent, Eds.) Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), 2227–2237. https://doi.org/10.18653/v1/N18-1202

Razin, M. J., Karim, M. A., Mridha, M. F., Rifat, S. M., & Alam, T. (2021). A Long Short-Term Memory (LSTM) Model for Business Sentiment Analysis Based on Recurrent Neural Network. Sustainable Communication Networks and Application. https://doi.org/10.1007/978-981-15-8677-4_1

Sak, H., Senior, A., & Beaufays, F. (2014). Long Short-Term Memory Recurrent Neural Network Architectures for Large Scale Acoustic Modeling. INTERSPEECH, 338-342.

Săniuță, A., & Filip, S.-O. (2021). Artificial Intelligence: An Overview of European and Romanian Startups Landscape and the Factors that Determine their Success. Strategica. Shaping the Future of Business and Economy, 872-884.

Thirunavukarasu, A. J., Ting, D. S., Elangovan, K., Gutierrez, L., Tan, T. F., & Ting, D. S. (2023). Large language models in medicine. Nature Medicine, 29(8), 1930–1940.

Wang, C.-F. (2019, January 8). The Vanishing Gradient Problem. Towards Data Science. Retrieved March 1, 2024, from Medium: https://towardsdatascience.com/the-vanishing-gradient-problem-69bf08b15484

Wolfram, S. (2023, February 14). What Is ChatGPT Doing … and Why Does It Work? Retrieved from https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/

Yao, Y., Duan, J., Xu, K., Cai, Y., Sun, Z., & Zhang, Y. (2023). A Survey on Large Language Model (LLM) Security and Privacy: The Good, the Bad, and the Ugly. Preprint submitted to Elsevier.

Zhang, W., Li, Y., & Wang, S. (2019). Learning document representation via topic-enhanced LSTM model. Knowledge-Based Systems, 174, 194–204. https://doi.org/10.1016/j.knosys.2019.03.007

Zhao, W. X., Zhou, K., Li, J., Tang, T., Wang, X., Hou, Y., . . . Wen, J.-R. (2023). A Survey of Large Language Models. arXiv preprint arXiv:2303.18223. https://doi.org/10.48550/arXiv.2303.18223