Short text classification is a fundamental task in natural language processing. Short texts lack rich linguistic structure and often come with unevenly distributed class samples, which limits the development of deep-learning-based short text classification. To address these limitations of text sequences, we propose using the large-scale pre-trained language model BERT to obtain feature information between words and sentences in the text, while a Graph Convolutional Network (GCN) with two convolutional layers captures the dependency relationships between words. We combine BERT with a GCN on short Chinese medical texts, where BertGCN outperforms competing methods in classification accuracy.
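The combination described above can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: random vectors stand in for BERT's sentence features and BERT classifier logits (which would normally come from a pre-trained model), the graph, dimensions, and the interpolation weight `lam` are all hypothetical, and the two-layer GCN uses the standard propagation rule H' = ReLU(Â H W) with a symmetrically normalized adjacency matrix.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_two_layer(A_norm, X, W1, W2):
    """Two-layer GCN producing class logits from node features X."""
    H = np.maximum(A_norm @ X @ W1, 0.0)  # first graph convolution + ReLU
    return A_norm @ H @ W2                # second graph convolution -> logits

rng = np.random.default_rng(0)

# Hypothetical word/document graph over 4 nodes (edges = dependency links).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

X = rng.standard_normal((4, 8))    # stand-in for BERT feature vectors
W1 = rng.standard_normal((8, 16))  # hypothetical layer-1 weights
W2 = rng.standard_normal((16, 3))  # hypothetical layer-2 weights, 3 classes

Z_gcn = gcn_two_layer(normalize_adj(A), X, W1, W2)
Z_bert = rng.standard_normal((4, 3))  # stand-in for BERT classifier logits

# Interpolate the two predictions, as in BertGCN-style models;
# lam = 0.7 is an arbitrary illustrative value.
lam = 0.7
Z = lam * Z_gcn + (1 - lam) * Z_bert
print(Z.shape)
```

In a real system the GCN branch exploits graph structure between words while the BERT branch supplies contextual features; the interpolation weight balances the two predictions.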