Page 273 - Contributed Paper Session (CPS) - Volume 4
CPS2222 Abdullah M.R. et al.
$$y_i - ((w \cdot x_i) + b) \le \varepsilon + \xi_i^*$$
$$\xi_i^{(*)} \ge 0, \qquad \varepsilon \ge 0$$
Here and below, it is understood that $i = 1, \ldots, \ell$, and that boldface
Greek letters denote $\ell$-dimensional vectors of the corresponding variables.
Introducing a Lagrangian with multipliers $\alpha_i^{(*)}, \eta_i^{(*)} \ge 0$, we obtain the Wolfe dual problem. In
addition, following Boser et al. (1992), we substitute a kernel $k$ for the dot product,
corresponding to a dot product in some feature space related to input space by
means of a nonlinear map $\Phi$,
$$k(x, y) = (\Phi(x) \cdot \Phi(y)).$$
This leads to the $\nu$-optimization problem: for $\nu \ge 0$, $C > 0$, maximize
$$W(\alpha, \alpha^*) = \sum_{i=1}^{\ell} (\alpha_i^* - \alpha_i)\, y_i - \frac{1}{2} \sum_{i,j=1}^{\ell} (\alpha_i^* - \alpha_i)(\alpha_j^* - \alpha_j)\, k(x_i, x_j)$$
Subject to
$$\sum_{i=1}^{\ell} (\alpha_i - \alpha_i^*) = 0, \qquad 0 \le \alpha_i^{(*)} \le \frac{C}{\ell}, \qquad \sum_{i=1}^{\ell} (\alpha_i + \alpha_i^*) \le C \cdot \nu.$$
Therefore, the final regression function of the Nu-SVR and the weight vector can
be written as
$$f(x) = \sum_{i=1}^{\ell} (\alpha_i^* - \alpha_i)\, k(x_i, x) + b, \qquad w = \sum_{i=1}^{\ell} (\alpha_i^* - \alpha_i)\, \Phi(x_i).$$
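As a concrete illustration (not part of the paper's own experiments), a ν-SVR of this form is available in scikit-learn as `NuSVR`; the synthetic data and the `nu` and `C` values below are assumptions chosen only for the sketch:

```python
# Minimal nu-SVR fit on synthetic data using scikit-learn's NuSVR.
# nu upper-bounds the fraction of margin errors and lower-bounds the
# fraction of support vectors; C scales the regularization trade-off.
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

model = NuSVR(nu=0.5, C=1.0, kernel="rbf")
model.fit(X, y)
y_hat = model.predict(X)     # f(x) evaluated at the training points
print(y_hat.shape)           # prints (200,)
```

The fitted `model.dual_coef_` holds the differences $(\alpha_i^* - \alpha_i)$ for the support vectors, matching the expansion of $f(x)$ above.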
Another point for consideration is the choice of kernel function.
According to Williams (2011), the SVM algorithm is sensitive to this tuning
choice (the type of kernel), so it is important to understand how kernel
functions work.
3.1 Bessel Function
The Bessel function gives rise to one type of kernel, defined by
$$k(x, y) = \frac{J_{v+1}(\sigma \|x - y\|)}{\|x - y\|^{-n(v+1)}}$$
where $J_{v+1}$ is the Bessel function of the first kind of order $v+1$.
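A minimal sketch of this kernel, assuming SciPy's Bessel function of the first kind `scipy.special.jv` for $J_{v+1}$; the default parameter values (`sigma`, `v`, `n`) and the convention for the value at zero distance are assumptions, not taken from the paper:

```python
# Sketch of the Bessel kernel k(x, y) = J_{v+1}(sigma*||x-y||) / ||x-y||^{-n(v+1)}
# using scipy.special.jv (Bessel function of the first kind).
import numpy as np
from scipy.special import jv

def bessel_kernel(x, y, sigma=1.0, v=0.0, n=1):
    r = np.linalg.norm(np.asarray(x) - np.asarray(y))
    if r == 0.0:
        return 1.0  # convention at x == y (an assumption)
    # dividing by r**(-n*(v+1)) is multiplying by r**(n*(v+1))
    return float(jv(v + 1, sigma * r) / r ** (-n * (v + 1)))

print(bessel_kernel([0.0, 0.0], [1.0, 0.0]))
```

A callable like this could be used to fill a precomputed Gram matrix for an SVR trained with `kernel="precomputed"`.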
Outliers can be detected by constructing cut-off points. The cut-off
point for Nu-SVR based on the Bessel function is given by
$$c = 1.05 \cdot |\mathrm{med}(e)| + 3\,\mathrm{MAD}(e)$$
where
$$\mathrm{MAD}(e) = \mathrm{med}\{|e_i - \mathrm{med}(e)|\}$$
and $e_i$ are the residuals of the fitted model.
As this approach detects all the outlier points in a single iteration, its
computational cost is lower than that of the conventional techniques. In the
experimental results section, the Bessel kernel function is applied to the
predicted values to detect outliers.
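The cut-off rule can be sketched as follows, where the residuals `e` are taken as observed minus predicted values and the first term is read as 1.05 times the absolute median residual (both readings are assumptions about the procedure):

```python
# Sketch of the cut-off rule: MAD(e) = med{|e_i - med(e)|}, and a point is
# flagged as an outlier when its absolute residual exceeds
# c = 1.05*|med(e)| + 3*MAD(e).
import numpy as np

def detect_outliers(y, y_pred):
    e = np.asarray(y) - np.asarray(y_pred)   # residuals
    mad = np.median(np.abs(e - np.median(e)))
    cutoff = 1.05 * np.abs(np.median(e)) + 3.0 * mad
    return np.abs(e) > cutoff

y      = np.array([1.0, 1.1, 0.9, 1.0, 5.0])   # last point is anomalous
y_pred = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
print(detect_outliers(y, y_pred))   # flags only the last point
```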
262 | ISI WSC 2019