Open Access

Research on Improved Dual Channel Medical Short Text Intention Recognition Algorithm



Figure 1. Architecture diagram of the AB-CNN-BGRU-att model
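
Figure 1 is available here only as a caption. As a rough reading of the components it names, the sketch below wires ALBERT token embeddings into two parallel channels, a TextCNN-style convolution channel and a BiGRU channel with additive attention, and concatenates the two feature vectors before a linear classifier. The class name, layer sizes, kernel sizes, and the albert-base-v2 checkpoint are illustrative assumptions, not the authors' implementation; num_classes=11 only mirrors the KUAKE-QIC category count in the dataset table below.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AlbertModel

class ABCNNBGRUAtt(nn.Module):
    # Channel 1: TextCNN-style convolutions over ALBERT token embeddings.
    # Channel 2: BiGRU with additive attention pooling.
    # The two channel outputs are concatenated and fed to a linear classifier.
    def __init__(self, num_classes=11, hidden=128, n_filters=64,
                 kernel_sizes=(2, 3, 4), albert_name="albert-base-v2"):
        super().__init__()
        self.albert = AlbertModel.from_pretrained(albert_name)
        d = self.albert.config.hidden_size
        self.convs = nn.ModuleList(
            [nn.Conv1d(d, n_filters, k, padding=k // 2) for k in kernel_sizes])
        self.bigru = nn.GRU(d, hidden, batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)
        self.fc = nn.Linear(len(kernel_sizes) * n_filters + 2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        h = self.albert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state   # (B, T, d)
        # CNN channel: parallel convolutions, ReLU, max-over-time pooling
        c = h.transpose(1, 2)                                              # (B, d, T)
        cnn_feats = torch.cat(
            [F.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1)
        # BiGRU channel: attention-weighted sum of the hidden states
        g, _ = self.bigru(h)                                               # (B, T, 2*hidden)
        scores = self.att(g).squeeze(-1).masked_fill(attention_mask == 0, -1e9)
        alpha = torch.softmax(scores, dim=1).unsqueeze(-1)
        gru_feats = (alpha * g).sum(dim=1)                                 # (B, 2*hidden)
        return self.fc(torch.cat([cnn_feats, gru_feats], dim=1))           # (B, num_classes)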

Figure 2. ALBERT model structure

Figure 3. GRU network structure
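
Figure 3 is likewise reduced to its caption here. For reference, the standard GRU cell it depicts computes an update gate, a reset gate, a candidate state, and the new hidden state as

z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)
r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)
\tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)
h_t = z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t

where \sigma is the sigmoid function and \odot denotes element-wise multiplication.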

Figure 4. BiGRU network structure
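
A BiGRU, as in Figure 4, runs one GRU over the sequence left to right and a second one right to left, and concatenates the two hidden states at each step, h_t = [\overrightarrow{h}_t ; \overleftarrow{h}_t], so every position carries both preceding and following context.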

Figure 5. Improved TextCNN model structure

Figure 6. Model validation results

Figure 7. Comparison of network time

Comparison of experimental results

Model                Acc (%)  Precision (%)  Recall (%)  F1 (%)
SAttBiGRU            96.16    96.20          96.16       96.17
Self-Attention-CNN   94.85    94.89          94.85       94.85
BiGRU-MCNN           95.43    95.45          95.43       95.43
MC-AttCNN-AttBiGRU   95.93    95.98          95.93       95.93
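
Acc, Precision, Recall, and F1 in the result tables are the usual multi-class classification metrics. A minimal sketch of how one such row can be computed with scikit-learn is shown below; the weighted averaging mode is an assumption, since the excerpt does not state which averaging scheme the paper uses.

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def metrics_row(y_true, y_pred):
    # Accuracy plus weighted precision, recall and F1, as percentages,
    # matching the four columns of the result tables.
    acc = accuracy_score(y_true, y_pred)
    pre, rec, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0)
    return {"Acc%": 100 * acc, "Pre%": 100 * pre,
            "Recall%": 100 * rec, "F1%": 100 * f1}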

Experimental datasets

Name             Training set  Test set  Validation set  Categories  Total
KUAKE-QIC        6931          1994      1955            11          10880
THUCNews_Title   180000        10000     10000           10          200000

Results of the ablation experiment

Model             Acc (%)  Precision (%)  Recall (%)  F1 (%)
TextCNN           89.96    89.90          89.96       89.90
Improved TextCNN  94.85    94.89          94.85       94.85
BiGRU-att         94.00    94.17          94.00       94.90
AB-CNN-BGRU-att   96.68    96.68          96.67       96.67

Comparison between BiGRU-att and BiLSTM-att

Network layer  Average duration (s)  Total duration (s)  Acc (%)  F1 (%)
BiGRU-att      2286.9                45738               90.83    90.64
BiLSTM-att     2422.55               48451               90.45    90.41
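
The shorter training time of BiGRU-att is consistent with the GRU cell using three weight blocks (reset gate, update gate, candidate state) against the LSTM's four. A quick parameter-count check in PyTorch, with an illustrative input size of 312 and hidden size of 128 rather than the paper's actual configuration:

import torch.nn as nn

def n_params(module):
    return sum(p.numel() for p in module.parameters())

bigru = nn.GRU(input_size=312, hidden_size=128, bidirectional=True, batch_first=True)
bilstm = nn.LSTM(input_size=312, hidden_size=128, bidirectional=True, batch_first=True)
print(n_params(bigru), n_params(bilstm))  # the BiGRU holds roughly three quarters of the BiLSTM's weights
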
eISSN: 2470-8038
Language: English
Publication timeframe: 4 issues per year
Journal subjects: Computer Sciences, other