$$\hat{\beta} = \arg\min_{\beta}\ \underset{i}{\operatorname{med}}\, e_i^2 \tag{7}$$

    so that the chosen subset is the one with the least median of squared errors and $\hat{e}$ is the error vector produced by that subset.
5.  Calculate the robust standard deviation ($s^0$), which can be formulated as [2]:

    $$s^0 = 1.4826 \left(1 + \frac{5}{n - p}\right) \sqrt{\underset{i}{\operatorname{med}}\, e_i^2} \tag{8}$$

    where the constant 1.4826 is chosen to provide good efficiency on clean data with Gaussian noise, and the factor $5/(n - p)$ compensates for the effect of a small sample size [2]. The robust standard deviation calculated after each network training determines the border between the outliers and the majority of the clean data. In the next step the data set is reduced, so the training process can be more accurate.
6.  Calculate the weight $w_i$, for example with $w_i = 1$ if $|u_i| \le 2.5$ and $w_i = 0$ otherwise, where $u_i = |e_i| / s^0$. Steps 4-6 are illustrated in the first code sketch after this list.
7.  Do the data fitting by the weighted least squares method with $w_i$ to get the final $\hat{\beta}$. The Weighted Least Squares (WLS) method is the same as the Ordinary Least Squares (OLS) method, except that WLS introduces an additional variable $W$, the diagonal matrix containing the weights $w_i$ (see the second sketch after this list). The equation can be written as [4]:

    $$\hat{\beta} = (X'WX)^{-1}X'Wy \tag{9}$$

    where

    $$W = \begin{bmatrix} w_1 & 0 & \cdots & 0 \\ 0 & w_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & w_n \end{bmatrix} \tag{10}$$
8.  As the goal of the LMS regression method is to minimize the influence of the outliers, we evaluate the models using Cook's distance for detecting the outliers (see the last sketch after this list). Cook's distance can be formulated as [5]:

    $$D_i = \frac{t_i^2}{p}\left\{\frac{h_{ii}}{1 - h_{ii}}\right\} \tag{11}$$

    where $t_i$ denotes the studentized residuals, $h_{ii}$ denotes the $i$-th diagonal element of the hat matrix, and $p$ is the number of estimated parameters. The hat matrix can be formulated as [5]:

    $$H = X(X'X)^{-1}X' \tag{12}$$
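The subset search of eq. (7), the robust scale of eq. (8), and the hard weights of step 6 can be sketched together as below. This is a minimal illustration assuming a linear model $y = X\beta + e$; the elemental-subset sampling strategy, the number of trials, and the names (lms_scale_and_weights, n_trials) are illustrative choices, not the authors' implementation.

    import numpy as np

    def lms_scale_and_weights(X, y, n_trials=500, seed=0):
        # Steps 4-6: LMS subset search (eq. 7), robust scale (eq. 8), 0/1 weights.
        rng = np.random.default_rng(seed)
        n, p = X.shape
        best_med, best_beta = np.inf, None
        for _ in range(n_trials):
            idx = rng.choice(n, size=p, replace=False)  # elemental subset of p points
            try:
                beta = np.linalg.solve(X[idx], y[idx])  # exact fit through the subset
            except np.linalg.LinAlgError:
                continue                                # singular subset, skip it
            med = np.median((y - X @ beta) ** 2)        # eq. (7): med_i e_i^2
            if med < best_med:
                best_med, best_beta = med, beta
        e = y - X @ best_beta                           # error vector of the best subset
        s0 = 1.4826 * (1 + 5.0 / (n - p)) * np.sqrt(best_med)  # eq. (8)
        w = (np.abs(e) / s0 <= 2.5).astype(float)       # step 6: w_i = 1 iff |u_i| <= 2.5
        return best_beta, s0, w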
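A sketch of the WLS solve in step 7, eqs. (9)-(10). np.linalg.solve is used in place of an explicit matrix inverse, a routine numerical choice that the paper does not specify.

    import numpy as np

    def weighted_least_squares(X, y, w):
        W = np.diag(w)                  # eq. (10): W = diag(w_1, ..., w_n)
        XtW = X.T @ W
        # eq. (9): beta_hat = (X' W X)^{-1} X' W y, without forming the inverse
        return np.linalg.solve(XtW @ X, XtW @ y)

Because $W$ is diagonal, X.T @ W could also be computed as X.T * w without materializing the full $n \times n$ matrix; the dense form is kept only to mirror eq. (10).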
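Finally, a sketch of the Cook's distance screening in step 8, eqs. (11)-(12). The internally studentized residuals are computed with the usual OLS variance estimate $s^2 = e'e/(n - p)$, which is an assumption here since the paper only names the quantities.

    import numpy as np

    def cooks_distance(X, y, beta):
        n, p = X.shape
        e = y - X @ beta
        H = X @ np.linalg.solve(X.T @ X, X.T)   # eq. (12): H = X (X'X)^{-1} X'
        h = np.diag(H)
        s2 = (e @ e) / (n - p)                  # residual variance estimate (assumed)
        t = e / np.sqrt(s2 * (1 - h))           # internally studentized residuals
        return (t ** 2 / p) * h / (1 - h)       # eq. (11)

In use the three pieces chain naturally: lms_scale_and_weights supplies the weights for weighted_least_squares, and the resulting fit is screened with cooks_distance.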


