A Combined Dimensional Kernel Method for Graph Classification


Tiejun Cao (School of Electrical and Information Engineering, Hunan International Economics University, Changsha, China)
Copyright: © 2017 | Pages: 12
DOI: 10.4018/JITR.2017070102
Learning from data that contains structural information is an important problem in the field of machine learning, and kernel methods are an effective technique for solving it. A combined dimensional kernel method for graph classification is proposed in this paper. A two-dimensional kernel is first constructed, incorporating one-dimensional information to characterize the molecular chemistry; a three-dimensional kernel is then constructed from molecular-mechanics knowledge to characterize the physical properties of the molecule. On this basis, the kernels of different dimensions are integrated, and a quadratically constrained quadratic program is solved to obtain the optimal kernel combination. The experimental results show that the proposed method outperforms existing algorithms.
Article Preview

1. Introduction

Many commonly used classification algorithms assume that each sample is represented by a vector, whereas in many practical applications samples are structured: RNA and DNA sequences in biology (Durbin, Eddy, Krogh et al., 1998; Zeng, 2015; Gao, Cheng et al., 2016), tree representations in natural language processing (Manning & Schütze, 1999), semi-structured XML (Abiteboul, Buneman, & Suciu, 2000), and graph representations of chemical molecules (Swamidass, Chen, Bruand et al., 2005; Xu, Peng et al., 2016). How to learn effectively from the structural information contained in such samples is an important problem in the field of machine learning. Results in this area play a role in toxicity detection, mutagenesis and carcinogenicity detection for chemical molecules, and many other applications (Kramer & De Raedt, 2001; Inokuchi, Washio, & Motoda, 2000; Wang, Qin et al., 2012).

The kernel method guarantees generalization performance by mapping the data into a high-dimensional feature space and optimizing the structural risk in that space. Because the method only requires a kernel function that measures the similarity between samples, and is not tied to any specific sample representation, it can learn effectively from structural data such as graphs (Scholkopf & Smola, 2002).
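To make the idea concrete, the following is a minimal sketch of a graph kernel, not the kernel proposed in this paper: each toy "molecule" is reduced to a histogram of its atom labels, and the kernel is the inner product of these histograms, which is positive semi-definite by construction. The label alphabet and example molecules are illustrative.

```python
import numpy as np

# Hypothetical atom-label alphabet for the toy examples below.
LABELS = ["C", "N", "O", "H"]

def label_histogram(node_labels):
    """Map a graph's node labels to a fixed-length count vector."""
    return np.array([node_labels.count(l) for l in LABELS], dtype=float)

def kernel(g1, g2):
    """Similarity of two graphs = inner product of their label histograms."""
    return float(label_histogram(g1) @ label_histogram(g2))

# Two toy "molecules" given simply as lists of atom labels.
ethanol = ["C", "C", "O", "H", "H", "H", "H", "H", "H"]
water = ["O", "H", "H"]

print(kernel(ethanol, water))  # 2*0 + 0*0 + 1*1 + 6*2 = 13.0
```

Because the kernel is an explicit inner product, the resulting Gram matrix can be fed directly to any kernel classifier that accepts precomputed kernels.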

Constructing a suitable kernel function for a given learning task is the core problem of the kernel approach. The difficulty is to use the sample information efficiently and sensibly to describe the similarity of samples, while also satisfying positive semi-definiteness of the kernel matrix so that an optimal solution can be obtained. For the structures of chemical molecules, researchers have proposed a variety of kernel functions, such as the random path kernel (Gartner, Flach, & Wrobel, 2003), the best-matching kernel (Frohlich, Wegner, Sieker et al., 2005), and the interval kernel (Kashima, Tsuda, & Inokuchi, 2003). Recently, Swamidass et al. designed a family of kernel functions for one-dimensional (1D), two-dimensional (2D), and three-dimensional (3D) representations of chemical molecules using domain knowledge. However, that method considers only the chemical characteristics of the molecule and ignores its physical characteristics (Swamidass, Chen, Bruand et al., 2005).
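The positive semi-definiteness requirement mentioned above can be checked numerically. As a sketch: any Gram matrix built from explicit feature vectors is positive semi-definite by construction, which we confirm here by inspecting its eigenvalues (the data is random and purely illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.standard_normal((5, 3))   # 5 samples, 3-dimensional features
K = features @ features.T                # Gram matrix: K_ij = <x_i, x_j>

# A valid (Mercer) kernel matrix must have no negative eigenvalues;
# a small tolerance absorbs floating-point round-off.
eigvals = np.linalg.eigvalsh(K)
print(np.all(eigvals >= -1e-10))  # True
```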

Quadratically constrained quadratic programming (QCQP) is introduced in this paper to generate the optimal combination of the 2D and 3D kernels. Experiments show that the proposed method performs better than existing methods on the 10 PTC and NCI data sets, and matches the performance of existing algorithms on the Mutag data set.
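A sketch of the combination step only: the paper solves a QCQP to learn the kernel weights, whereas here the weights mu are fixed by hand purely for illustration. The point shown is that a convex combination of positive semi-definite kernel matrices is itself positive semi-definite, so the combined kernel remains a valid kernel.

```python
import numpy as np

rng = np.random.default_rng(1)
X2d = rng.standard_normal((6, 4))   # stand-ins for 2D-representation features
X3d = rng.standard_normal((6, 7))   # stand-ins for 3D-representation features
K2d = X2d @ X2d.T                   # 2D kernel matrix (Gram matrix, PSD)
K3d = X3d @ X3d.T                   # 3D kernel matrix (Gram matrix, PSD)

# Illustrative weights; in the paper these are obtained by solving a QCQP.
mu = np.array([0.4, 0.6])           # mu >= 0, sum(mu) == 1
K = mu[0] * K2d + mu[1] * K3d       # combined kernel

print(np.all(np.linalg.eigvalsh(K) >= -1e-10))  # True: still PSD
```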
