Magnetic Remanence Prediction of NdFeB Magnets Based on a Novel Machine Learning Intelligence Approach Using a Particle Swarm Optimization Support Vector Regression

WenDe Cheng (School of Mathematics and Physics, Chongqing University of Science and Technology, Chongqing, China)
DOI: 10.4018/IJSSCI.2014100105

Studies have shown that the chemical composition affects the magnetic properties of NdFeB magnets. To obtain NdFeB magnets with the desired properties, it is advantageous to have an accurate model that predicts the magnetic properties for different compositions. In this paper, based on an experimental dataset on the magnetic remanence of NdFeB, a prediction and optimization model using support vector regression (SVR) combined with particle swarm optimization (PSO) was developed. The SVR estimates agreed well with the experimental data. Test results of leave-one-out cross validation show that the mean absolute error does not exceed 0.0036, the mean absolute percentage error is only 0.53%, and the correlation coefficient is as high as 0.839. This implies that the SVR model can be used to identify combinations of component proportions that yield a suitable magnetic remanence of NdFeB.
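The three error measures quoted above can be computed from the leave-one-out predictions as follows. This is a minimal sketch; the remanence values and predictions below are hypothetical placeholders, not the paper's data.

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def corr(y_true, y_pred):
    """Pearson correlation coefficient between measured and predicted values."""
    n = len(y_true)
    mt, mp = sum(y_true) / n, sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    st = math.sqrt(sum((t - mt) ** 2 for t in y_true))
    sp = math.sqrt(sum((p - mp) ** 2 for p in y_pred))
    return cov / (st * sp)

# Hypothetical remanence values (tesla) and their LOOCV predictions.
y_true = [1.32, 1.28, 1.35, 1.30, 1.26]
y_pred = [1.31, 1.29, 1.34, 1.31, 1.27]
print(round(mae(y_true, y_pred), 4))   # → 0.01
```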

1. Introduction

The rare earth permanent material NdFeB, with its low rare earth content, high magnetic remanence, relatively high coercivity, and high maximum magnetic energy product, has become a research focus in recent years (Kneller & Hawig, 1991; Skomski & Coey, 1993; Schrefl, Fidler, & Kronmuller, 1994). The magnetic remanence of NdFeB is closely related to the alloying element content, and it can usually be improved by optimizing the alloy composition (Jakubowicz & Jurczyk, 2000; Jakubowicz & Szlaferek, 1999; Rieger, Seeger, Li, & Kronmuller, 1995). The common approach, however, is to vary the content of one element while keeping the others fixed, observe the resulting trend in magnetic remanence, and then repeat the procedure for each remaining element. This approach requires heavy experimental work, and the interactions among the various components cannot be considered simultaneously. The relationship between the interacting components and the magnetic remanence is highly complex and nonlinear, and it is difficult to build an accurate theoretical model to predict the remanence. Support vector regression (SVR), proposed by Vapnik and coworkers in 1995, is a powerful machine learning method based on the structural risk minimization principle (Vapnik, 1995, 1999).
Due to its excellent properties, such as fast learning, global optimization, and strong generalization on small samples, SVR has been widely applied to nonlinear regression problems (Cai, Han, Ji, Chen, & Chen, 2003; Cai, Han, Ji, & Chen, 2004; Cai, Wang, & Chen, 2003; Cai, Wang, Sun, & Chen, 2003; Cai, Xiao, Tang, & Huang, 2013; Cai, Zhu, Wen, Pei, Wang, & Zhuang, 2010; Firat, Ozay, Onal, Oztekin, & Yarman Vural, 2013; Kharrat, Gasmi, Ben Messaoud, Benamrane, & Abid, 2011; Lin & Pai, 2001; Pei, Cai, Zhu, & Yan, 2013; Tang, Cai, & Zhao, 2012; Wen, Cai, Liu, Pei, Zhu, & Xiao, 2009; Xiao, Cai, Tang, & Huang, 2013; Yi, Peng, & Li, 2012). In this paper, an SVR model integrating leave-one-out cross validation (LOOCV) was built to predict the magnetic remanence of NdFeB magnets, with the particle swarm optimization algorithm used to optimize its parameters.
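The workflow described above — PSO searching for a kernel hyperparameter that minimizes the LOOCV error — can be sketched as follows. Everything here is hypothetical: the composition/remanence dataset is invented, and an RBF kernel smoother stands in for the paper's epsilon-SVR so the sketch needs no external libraries; only the PSO-over-LOOCV structure mirrors the paper's approach.

```python
import math
import random

random.seed(0)

# Toy composition -> remanence dataset (hypothetical values, for illustration only).
# Each row: alloying-element fractions; targets: remanence in tesla.
X = [(0.10, 0.02), (0.12, 0.03), (0.14, 0.02), (0.11, 0.04),
     (0.13, 0.05), (0.15, 0.03), (0.12, 0.02), (0.14, 0.04)]
y = [1.32, 1.30, 1.27, 1.29, 1.25, 1.24, 1.31, 1.26]

def rbf_predict(x, Xtr, ytr, gamma):
    """RBF kernel smoother: a lightweight stand-in for epsilon-SVR."""
    ws = [math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, xi))) for xi in Xtr]
    s = sum(ws)
    if s == 0.0:
        return sum(ytr) / len(ytr)
    return sum(w * t for w, t in zip(ws, ytr)) / s

def loocv_mae(gamma):
    """Leave-one-out cross validation: hold out each sample once, average the errors."""
    errs = []
    for i in range(len(X)):
        Xtr, ytr = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        errs.append(abs(rbf_predict(X[i], Xtr, ytr, gamma) - y[i]))
    return sum(errs) / len(errs)

def pso(fitness, lo, hi, n_particles=10, n_iter=30, w=0.7, c1=1.5, c2=1.5):
    """Minimal one-dimensional particle swarm optimization (minimization)."""
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest, pbest_f = pos[:], [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            # Velocity update: inertia + pull toward personal and global bests.
            vel[i] = (w * vel[i]
                      + c1 * random.random() * (pbest[i] - pos[i])
                      + c2 * random.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i], f
    return gbest, gbest_f

best_gamma, best_mae = pso(loocv_mae, 0.01, 1000.0)
print(f"gamma={best_gamma:.3g}  LOOCV MAE={best_mae:.4f}")
```

In the paper's actual model the PSO would search jointly over the SVR penalty C, the kernel width, and the epsilon-tube, with the LOOCV error as the fitness function, in the same optimize-over-held-out-error pattern shown here.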
