The support vector machine (SVM)29–31, one of the dominant statistical learning algorithms, was adopted as the core prediction algorithm. It seeks the optimal hyperplane with the largest margin between the two groups18,40 by solving the following constrained optimization problem:

$$\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\; \frac{1}{2}\|\mathbf{w}\|^{2} + C\sum_{i=1}^{M}\xi_{i}$$
subject to $y_{i}\left(\mathbf{w}^{T}\mathbf{x}_{i} + b\right) \geq 1 - \xi_{i}$ and $\xi_{i} \geq 0$, for all $i$. After introducing the kernel function, the discriminant function of the SVM took the following form:

$$f(\mathbf{x}) = \operatorname{sign}\!\left(\sum_{i=1}^{M}\alpha_{i}\, y_{i}\, K(\mathbf{x}_{i}, \mathbf{x}) + b\right)$$
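For concreteness, the kernelized discriminant can be checked numerically. The sketch below is an illustration only, not part of the original protocol: it fits an RBF-kernel SVM on hypothetical toy data with scikit-learn (an assumed library choice, with an assumed kernel width of 0.1) and verifies that the library's decision_function equals the support-vector sum above, using dual_coef_ (which stores $\alpha_{i} y_{i}$) and intercept_ (which stores $b$).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Hypothetical two-class toy data standing in for the benchmark dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

gamma = 0.1                                            # assumed kernel width
clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)

# f(x) = sum_i alpha_i * y_i * K(x_i, x) + b, summed over the support vectors:
# dual_coef_ holds alpha_i * y_i and intercept_ holds b in the fitted model.
K = rbf_kernel(clf.support_vectors_, X, gamma=gamma)    # K(x_i, x)
f_manual = clf.dual_coef_ @ K + clf.intercept_          # shape (1, n_samples)

print(np.allclose(f_manual.ravel(), clf.decision_function(X)))  # True
```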
In this paper, the radial basis function kernel18,41 was applied to construct the SVM classifier; it is given by $K(\mathbf{x}_{i}, \mathbf{x}_{j}) = \exp\!\left(-\gamma\,\|\mathbf{x}_{i} - \mathbf{x}_{j}\|^{2}\right)$, where $\gamma > 0$ is the kernel width parameter42. As the benchmark dataset was highly imbalanced, the different error cost (DEC)18 method was used to tackle the class imbalance problem24,43. In this approach, the SVM soft-margin objective function was adjusted to allocate two misclassification costs12, $w^{+}C$ for the positive class instances and $w^{-}C$ for the negative class instances:

$$\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\; \frac{1}{2}\|\mathbf{w}\|^{2} + w^{+}C\sum_{i:\,y_{i}=+1}\xi_{i} + w^{-}C\sum_{i:\,y_{i}=-1}\xi_{i} \tag{16}$$
In Eq. (16), $w^{+}$ is the weight for the positive instances and $w^{-}$ is the weight for the negative instances, defined by

$$w^{+} = \frac{M}{2M^{+}}, \qquad w^{-} = \frac{M}{2M^{-}},$$
where $M$ is the total number of elements, $M^{+}$ is the number of elements in the positive class, and $M^{-}$ is the number of elements in the negative class.
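As a practical illustration, this class-weighted training can be reproduced with scikit-learn, whose class_weight argument multiplies the penalty C by a per-class factor, which matches the DEC-style cost adjustment. The snippet below is a minimal sketch on hypothetical imbalanced data, assuming the inverse class-frequency weights shown above and arbitrary hyperparameter values; it is not the authors' implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Hypothetical imbalanced two-class data (roughly 9:1) standing in for the benchmark set.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

M = len(y)                    # total number of elements
M_pos = int(np.sum(y == 1))   # elements in the positive (minority) class
M_neg = int(np.sum(y == 0))   # elements in the negative (majority) class

# Inverse class-frequency weights, following the definition above.
w_pos = M / (2 * M_pos)
w_neg = M / (2 * M_neg)

# class_weight scales C per class, so the effective costs become w+ * C and w- * C.
clf = SVC(kernel="rbf", C=1.0, gamma="scale",
          class_weight={1: w_pos, 0: w_neg})
clf.fit(X, y)
```

Note that passing class_weight="balanced" applies the same $M/(2M^{\pm})$ weighting automatically for a two-class problem.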