Page 272 - Contributed Paper Session (CPS) - Volume 4
CPS2222 Abdullah M.R. et al.
2.1 Radial Basis Function (RBF)
The Gaussian radial basis function is the most commonly used type of kernel, given by

K(x, x_j) = exp[ −‖x − x_j‖² / (2h²) ]
where x is the explanatory variable, x_j is the j-th observation of x, and h is the bandwidth of the kernel function. According to Rana et al. (2018), outliers can be detected by using cut-off points as follows:
CP_RBF = 2 Med|Z_j| + 2√Var(Med)
where

Z_j = K(x, x_j)
Because this approach detects all outlier points in a single iteration, its computational cost is lower than that of conventional techniques. It is also suitable for non-expert users because it uses a fixed set of parameters. In the experimental results sections, the RBF kernel function is utilized with (h = 1, ε = 0, C = 10000), using the predicted values to detect outliers.
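As a rough single-pass sketch of the procedure above (assuming the Z values are the kernel similarities K(x, x_j) to a central point, and using Var(Z) in place of var(Med); all function names and toy values are illustrative, not the authors' code):

```python
import numpy as np

def rbf_kernel(x, xj, h=1.0):
    """Gaussian RBF kernel K(x, x_j) = exp(-|x - x_j|^2 / (2 h^2)) for scalar features."""
    return np.exp(-np.abs(x - xj) ** 2 / (2.0 * h ** 2))

def rbf_cutoff(z):
    """Cut-off point in the spirit of CP_RBF = 2 Med|Z| + 2 sqrt(var);
    the variance is taken over the Z values themselves (an assumption)."""
    return 2.0 * np.median(np.abs(z)) + 2.0 * np.sqrt(np.var(z))

# Toy data with one gross outlier (all values hypothetical).
x = np.array([1.0, 1.1, 0.9, 1.05, 8.0])
center = np.median(x)
z = rbf_kernel(x, center, h=1.0)  # kernel similarity of each point to the center
cp = rbf_cutoff(z)                # cut-off computed once, in a single pass
```

The gross outlier at 8.0 receives a near-zero kernel value, so it separates cleanly from the inliers in one evaluation of Z and the cut-off, which is what keeps the computational cost low.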
3. Nu-Support Vector Regression
Support Vector (SV) machines are another class of learning algorithms motivated by results of statistical learning theory (Vapnik, 1995). They represent the decision boundary in terms of a typically small subset of all training examples (Schölkopf et al., 1995), called the support vectors; the approach was initially developed for pattern recognition. Vapnik devised the so-called ε-insensitive loss function which, according to Deng et al. (2012), allows ε-SVR to be handled in a similar way. ε-SVR is modified into the equivalent ν-support vector regression (ν-SVR), where the parameter ε is replaced by a meaningful parameter ν.
|y − f(x)|_ε = { 0,               if |y − f(x)| ≤ ε
              { |y − f(x)| − ε,   otherwise
which does not penalize errors below some ε > 0, chosen a priori, so that this property carries over to the case of SV regression (Schölkopf et al., 1999).
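A minimal sketch of the ε-insensitive loss defined above (the function name and toy values are illustrative):

```python
def eps_insensitive_loss(y, fx, eps):
    """|y - f(x)|_eps: zero inside the eps-tube, linear outside it."""
    err = abs(y - fx)
    return 0.0 if err <= eps else err - eps

# Errors inside the tube cost nothing; outside, cost grows as err - eps.
inside = eps_insensitive_loss(1.0, 1.05, eps=0.1)   # |error| = 0.05 <= 0.1 -> 0.0
outside = eps_insensitive_loss(1.0, 1.5, eps=0.1)   # |error| = 0.5 -> 0.5 - 0.1
```

This is the property referred to in the text: small deviations are not penalized at all, which is what makes SV regression solutions sparse.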
The primal problem of ν-SVR, following Chang et al. (2002), introduces the corresponding kernel K(x, x*) = (Φ(x) · Φ(x*)) and can be rewritten as follows:

minimize    (1/2)‖w‖² + C(νε + (1/ℓ) ∑_{i=1}^{ℓ} (ξ_i + ξ_i*))

subject to  (w · Φ(x_i) + b) − y_i ≤ ε + ξ_i
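To make the objective concrete, here is a minimal sketch that evaluates the ν-SVR primal for a candidate solution, assuming a linear feature map Φ(x) = x (function names and toy values are hypothetical, not from the paper):

```python
import numpy as np

def nu_svr_objective(w, xi, xi_star, eps, C, nu):
    """Primal objective: (1/2)||w||^2 + C * (nu*eps + (1/l) * sum(xi_i + xi_i*))."""
    l = len(xi)
    return 0.5 * np.dot(w, w) + C * (nu * eps + np.sum(xi + xi_star) / l)

def constraint_lhs(w, b, x_i, y_i):
    """Left-hand side (w . Phi(x_i) + b) - y_i of the first constraint,
    which must not exceed eps + xi_i; Phi(x) = x is assumed here."""
    return np.dot(w, x_i) + b - y_i

# Toy candidate solution (all values hypothetical).
w = np.array([0.5]); b = 0.1
xi = np.array([0.0, 0.2]); xi_star = np.array([0.1, 0.0])
obj = nu_svr_objective(w, xi, xi_star, eps=0.1, C=1.0, nu=0.5)
```

Note that ε enters the objective weighted by ν: increasing ν raises the cost of a wide tube, which is how ν-SVR tunes the tube width automatically instead of fixing ε a priori.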
261 | ISI WSC 2019