Chuanjun Zhao is an associate professor at the Shanxi University of Finance and Economics, a master's supervisor in computer application technology, and a member of the Affective Computing Technical Committee of the Chinese Information Processing Society of China. He received a Ph.D. degree in systems engineering from Shanxi University in 2018. Dr. Zhao is a member of IEEE, ACM, and CCF. His main research interests are data mining and natural language processing. He has published many papers in journals such as Information Sciences, Computer Speech & Language, Knowledge-Based Systems, Journal of Computer Research and Development, and Journal of Software. His research is supported by the National Natural Science Foundation of China, the Shanxi Applied Basic Research Program, and the Scientific and Technological Innovation Programs of Higher Education Institutions in Shanxi.
Speech Title: Cross-domain sentiment classification
Abstract: Training data in a specific domain are often insufficient for text sentiment classification. Cross-domain sentiment classification (CDSC) applies transfer learning to text-based social media and effectively alleviates the shortage of labeled data in specific domains. Hence, this paper proposes a CDSC method based on a parameter transferring and attention sharing mechanism (PTASM), whose architecture comprises a source domain network (SDN) and a target domain network (TDN). First, hierarchical attention networks are constructed on the training data with pre-trained language representations, such as global vectors for word representation (GloVe) and bidirectional encoder representations from transformers (BERT). Parameter transferring mechanisms are introduced at both the word and sentence levels. Then, parameter transfer and fine-tuning techniques are adopted to transfer network parameters from the SDN to the TDN. Moreover, sentiment attention serves as a bridge for sentiment transfer across domains. Finally, attention mechanisms are introduced at the word and sentence levels, and sentiment attention is shared across domains at both levels. Extensive experiments show that the PTASM-BERT method achieves state-of-the-art results on Amazon review cross-domain datasets.
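As a rough illustration of the two ideas in the abstract, the sketch below models parameter transfer (copying word- and sentence-level parameters from the SDN to the TDN before fine-tuning) and attention sharing (blending source and target attention weights) with plain Python lists. All names and the blending scheme are hypothetical simplifications, not the paper's actual implementation, which would use a deep-learning framework.

```python
# Hypothetical sketch of parameter transfer and attention sharing (PTASM idea).
# Networks are modeled as dicts mapping layer name -> parameter vector.

def transfer_parameters(source, target, layers):
    """Copy the named layers from the source-domain network (SDN) to the
    target-domain network (TDN), initializing it before fine-tuning."""
    for name in layers:
        target[name] = list(source[name])  # copy so fine-tuning leaves the SDN intact
    return target

def share_attention(source_attn, target_attn, alpha=0.5):
    """Blend source and target attention weights so sentiment attention
    learned on the source domain can guide the target domain."""
    return [alpha * s + (1 - alpha) * t for s, t in zip(source_attn, target_attn)]

# Toy parameters for a two-level (word / sentence) hierarchical model.
sdn = {"word_encoder": [0.2, 0.4], "sent_encoder": [0.1, 0.3], "classifier": [0.9]}
tdn = {"word_encoder": [0.0, 0.0], "sent_encoder": [0.0, 0.0], "classifier": [0.5]}

# Transfer the shared encoder layers; the classifier stays domain-specific.
tdn = transfer_parameters(sdn, tdn, ["word_encoder", "sent_encoder"])

# Share word-level sentiment attention between the two domains.
word_attn = share_attention([0.8, 0.2], [0.4, 0.6])
print(tdn["word_encoder"], word_attn)
```

In this toy version, only the encoder layers are transferred while each domain keeps its own classifier, mirroring the common fine-tuning practice of reusing lower layers and retraining the task head.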