Open Access

Research on Multi-Domain Intelligent Customer Service Dialog Modeling with Integrated Transfer Learning Strategies

About this article


The intelligent customer service dialog model is centered on human-machine dialog and has good prospects for commercial application across multiple domains. In this paper, we use a Siamese-LSTM model to vectorize the questions in an FAQ question-and-answer database, obtaining semantic representation vectors for the sentences; an approximate retrieval algorithm then indexes the question-and-answer database and performs approximate nearest-neighbor retrieval for each query. After the question query is completed, transfer learning is employed to build a mapping between input questions and human responses, enabling the model to produce sentences close to human responses. Tests show that the task success rate gradually stabilizes around 0.80 at about the 100th training round and thereafter fluctuates up to around 0.986. For the average number of conversation rounds, transfer learning improves the conversation efficiency of the intelligent customer service: as the number of training rounds increases, the average number of conversation rounds gradually stabilizes at about the 150th training round, eventually settling at about 4.2 rounds. The transfer learning strategy helps machine responses stay as close to human responses as possible.
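The retrieval step described above can be illustrated with a minimal sketch: FAQ questions are mapped to semantic vectors and a query is matched by cosine nearest-neighbor search. The `embed` function below is a deterministic stand-in for the paper's Siamese-LSTM encoder, and the exhaustive cosine search stands in for the approximate nearest-neighbor index; all names and data are illustrative, not the authors' implementation.

```python
import numpy as np

def embed(sentence: str, dim: int = 64) -> np.ndarray:
    """Stand-in sentence encoder: a deterministic pseudo-embedding.
    In the paper, this role is played by the Siamese-LSTM model's
    semantic representation vector for the sentence."""
    seed = abs(hash(sentence)) % (2**32)
    return np.random.default_rng(seed).standard_normal(dim)

# Hypothetical FAQ question-and-answer database (questions only shown).
faq = [
    "How do I reset my password?",
    "What are your business hours?",
    "How can I track my order?",
]

# Build the index: stack the question vectors and L2-normalize rows,
# so that a dot product equals cosine similarity.
index = np.stack([embed(q) for q in faq])
index /= np.linalg.norm(index, axis=1, keepdims=True)

def retrieve(query: str) -> str:
    """Return the FAQ question most similar to the query
    (brute-force cosine search; a real system would use an
    approximate nearest-neighbor index for speed)."""
    v = embed(query)
    v /= np.linalg.norm(v)
    scores = index @ v  # cosine similarity against every FAQ question
    return faq[int(np.argmax(scores))]

best = retrieve("How do I reset my password?")
```

Because the stand-in encoder is deterministic, an exact repeat of an FAQ question retrieves itself with cosine similarity 1.0; with a trained Siamese-LSTM encoder, paraphrases of an FAQ question would also map near it in the vector space.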

eISSN:
2444-8656
Language:
English
Publication frequency:
Volume Open
Journal subjects:
Life Sciences, other, Mathematics, Applied Mathematics, General Mathematics, Physics