MusREL: A Utility-Weighted Multi-Strategy Relation Extraction Model-Based Intelligent System for Online Education

Zhen Zhu, Huaiyuan Lin, Dongmei Gu, Liting Wang, Hong Wu, Yun Fang
Copyright: © 2023 | Pages: 19
DOI: 10.4018/IJSWIS.329965

Abstract

To enhance the utility of online educational digital resources, the authors propose a practical and efficient multi-strategy relation extraction (RE) model for online education scenarios. First, a relation discrimination model predicts relations in unstructured teaching resources and eliminates noisy data. Then, multiple low-cost, efficient relation extraction strategies extract relations along different paths, and the authors' proposed multi-strategy weighting calculator weighs the strategies' outputs to derive the final target relations. To cope with low-resource relation extraction scenarios, the extraction results are complemented by prompt learning under a large-model paradigm. Because the model is also intended to serve commercial online-education scenarios, the authors propose a global rate controller that adjusts rate and throughput to the requirements of different scenarios, achieving the best balance of system stability, computation speed, and extraction performance.
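The utility-weighted combination described above can be illustrated as weighted voting over the outputs of several extraction strategies. This is a minimal sketch, not the paper's actual implementation; the strategy names, weights, and relation labels below are hypothetical.

```python
def combine_strategies(candidates, weights):
    """Weighted vote over per-strategy relation predictions.

    candidates: {strategy_name: predicted_relation}
    weights:    {strategy_name: utility weight}
    Returns the relation with the highest total weighted score.
    """
    scores = {}
    for strategy, relation in candidates.items():
        scores[relation] = scores.get(relation, 0.0) + weights.get(strategy, 0.0)
    return max(scores, key=scores.get)

# Hypothetical example: three strategies vote on the relation between
# a course entity and a teacher entity.
candidates = {
    "rule_based": "teaches",
    "dependency_path": "teaches",
    "prompt_llm": "authored_by",
}
weights = {"rule_based": 0.2, "dependency_path": 0.5, "prompt_llm": 0.4}
print(combine_strategies(candidates, weights))  # "teaches" (0.7 vs. 0.4)
```

In practice the weights would reflect each strategy's measured utility (e.g., validation accuracy) rather than fixed constants.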
Article Preview

The technique of relation extraction was first formally proposed in the 1990s, and after more than 30 years of development the paradigm of relation extraction has changed greatly. The earliest approaches used rule-based template matching, in which manually crafted rule templates played a crucial role and linguists were central to the relation extraction task at that stage (Huffman, 1995; Kim & Moldovan, 1995). One line of work used trigger words (Califf & Mooney, 1997; Nakashole et al., 2012); another was based on dependency syntactic analysis (Fundel et al., 2006) and on template matching over lexical information and positional relations (Nédellec, 2005; Nebhi, 2013). Rule-based template-matching approaches are simple and efficient but can match only very limited relations. By the 2000s, traditional machine learning approaches had become popular, and feature-based methods such as SVMs and maximum entropy models were used extensively (Lafferty et al., 2001; Och et al., 2004). Researchers and domain experts used domain knowledge to extract features from the original corpus and then applied traditional machine learning methods of classification and clustering to predict relations. Among these, supervised approaches became mainstream and achieved the best performance of the time. Some scholars used syntactic tree kernel functions that exploit syntactic dependencies instead of shallow string information, adding lexical, syntactic, and semantic labels as well as dependencies as supplementary features of the kernel functions (Culotta & Sorensen, 2004; Zhou et al., 2007). As a result, tree-kernel approaches trained on the Penn Treebank achieved promising results at that time (Zhang et al., 2008). Distant supervision is another approach to relation extraction.
This approach rests on the hypothesis that if two entities have some relation in a knowledge base, then unstructured sentences containing those two entities express that relation (Mintz et al., 2009). It can complement existing structured data, but noise has remained an obstacle to its application (Riedel et al., 2010). Unsupervised machine learning is yet another way of completing the RE task; a K-means clustering statistical machine learning approach has been proposed, providing a new direction for the relation extraction task (Chen et al., 2005).
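The distant-supervision hypothesis above can be sketched as a simple alignment step: each entity pair recorded in the knowledge base labels every sentence in which both entities co-occur. The knowledge base entries and sentence here are illustrative, and, as the paragraph notes, this labeling is inherently noisy.

```python
# Toy knowledge base mapping entity pairs to relations (hypothetical data).
knowledge_base = {("Marie Curie", "University of Paris"): "worked_at"}

def distant_label(sentence, kb):
    """Return (e1, e2, relation) labels for every KB pair whose entities
    both appear in the sentence (simple substring matching)."""
    labels = []
    for (e1, e2), rel in kb.items():
        if e1 in sentence and e2 in sentence:
            labels.append((e1, e2, rel))
    return labels

sent = "Marie Curie lectured at the University of Paris in 1906."
print(distant_label(sent, knowledge_base))
# [('Marie Curie', 'University of Paris', 'worked_at')]
```

The noise problem cited in the text arises exactly here: a sentence may mention both entities without expressing the knowledge-base relation, yet it is still labeled with it.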
