lecture: 2019/6/27

date: 2019-06-27

title: nonlinear smooth support vector machines i, ii

time: 6/27, 14:00-16:00

abstract:

(1) review of optimization problems with constraints

----primal form, dual form, karush-kuhn-tucker (kkt) conditions (standard forms sketched after this list).

----tangent vectors to the feasible set and linearized feasible directions.
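for reference, a standard statement of the objects listed above (notation is assumed here, not quoted from the lecture): for the primal problem

\[
\min_x f(x) \quad \text{s.t.} \quad g_i(x) \le 0, \; i = 1, \dots, m,
\]

with lagrangian \(L(x, \lambda) = f(x) + \sum_i \lambda_i g_i(x)\), the dual form is \(\max_{\lambda \ge 0} \min_x L(x, \lambda)\), and the kkt conditions at a candidate \(x^*\) read

\[
\nabla f(x^*) + \sum_i \lambda_i \nabla g_i(x^*) = 0, \qquad g_i(x^*) \le 0, \qquad \lambda_i \ge 0, \qquad \lambda_i \, g_i(x^*) = 0.
\]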


(2) binary classification problems/supervised learning problems

----linearly separable case: maximizing the margin between the boundary planes; primal and dual forms.

----nonseparable case: primal and dual problems for the 1-norm/2-norm soft margin svm (standard forms sketched below).
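concretely (standard forms, assumed rather than copied from the slides): the linearly separable primal is

\[
\min_{w, b} \ \tfrac{1}{2}\|w\|^2 \quad \text{s.t.} \quad y_i (w^\top x_i + b) \ge 1,
\]

where the margin between the boundary planes \(w^\top x + b = \pm 1\) equals \(2 / \|w\|\). the 1-norm soft margin version adds slack variables \(\xi_i\),

\[
\min_{w, b, \xi} \ \tfrac{1}{2}\|w\|^2 + C \sum_i \xi_i \quad \text{s.t.} \quad y_i (w^\top x_i + b) \ge 1 - \xi_i, \; \xi_i \ge 0,
\]

whose dual is the maximization \(\max_\alpha \sum_i \alpha_i - \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \, x_i^\top x_j\) subject to \(0 \le \alpha_i \le C\) and \(\sum_i \alpha_i y_i = 0\); the 2-norm variant penalizes \(\sum_i \xi_i^2\) instead, which removes the upper bound on \(\alpha\).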


(3) nonlinear support vector machine

----the two-spiral data set.

----learning a linear machine in feature space.

----kernels: representing the inner product in feature space.

----kernel techniques: monomials of degree d, the polynomial kernel, the gaussian (radial basis function) kernel.

----dual representation of the svm classifier (see the sketch after this list).
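a minimal numpy sketch of these kernels (the function names are illustrative, not from the lecture code):

import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # K[i, j] = exp(-gamma * ||A[i] - B[j]||^2): the inner product in the
    # rbf feature space, computed without ever forming the feature map.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def polynomial_kernel(A, B, d=2):
    # (x.z + 1)^d: the inner product of all monomials of degree up to d.
    return (A @ B.T + 1.0) ** d

# dual representation of the classifier:
#   f(x) = sign(sum_i alpha_i * y_i * K(x_i, x) + b)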


(4) smooth support vector machine

----svm as an unconstrained minimization problem.

----smoothing via the plus function (see the formula after this list).

----newton-armijo algorithm.
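the key substitution is the standard ssvm smoothing (stated here for reference): the nonsmooth plus function \((x)_+ = \max(x, 0)\) in the unconstrained objective is replaced by

\[
p(x, \alpha) = x + \frac{1}{\alpha} \log\bigl(1 + e^{-\alpha x}\bigr), \qquad \alpha > 0,
\]

which converges to \((x)_+\) as \(\alpha \to \infty\) and makes the objective twice continuously differentiable, so newton's method applies; the armijo rule then accepts the first step size \(\lambda \in \{1, \tfrac{1}{2}, \tfrac{1}{4}, \dots\}\) with \(f(x + \lambda d) \le f(x) + c \, \lambda \, \nabla f(x)^\top d\) for a fixed \(c \in (0, 1)\), ensuring sufficient decrease at every iteration.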


(5) nonlinear smooth support vector machine

----nonlinear ssvm motivation.

----kernel trick: gaussian kernel, monomials, polynomials.

----nonlinear classifier (dual form sketched below).
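in kernel form the resulting classifier is the usual dual representation (notation assumed):

\[
f(x) = \operatorname{sign}\Bigl( \sum_{i=1}^{n} u_i \, K(x, x_i) - \gamma \Bigr),
\]

so the smoothing and the newton-armijo iteration above carry over unchanged, with the kernel matrix taking the place of the data matrix.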


(6) reduced support vector machine

----reduced svm: a compressed model.

----a nonlinear kernel application: checkerboard training set.

----using 50 randomly selected points out of 1000 points.

----compressed model vs. full model (see the sketch after this list).
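a minimal sketch of the compression step (the random matrix A is a stand-in for the 1000 checkerboard points; variable names are illustrative):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 2))   # stand-in for the checkerboard training set
idx = rng.choice(len(A), size=50, replace=False)
A_bar = A[idx]                       # the 50 randomly selected reduced points
# rsvm replaces the full 1000 x 1000 kernel matrix K(A, A') with the
# rectangular 1000 x 50 matrix K(A, A_bar'), so training is cheaper and the
# final classifier stores only the reduced set:
#   f(x) = sign(K(x, A_bar') u - gamma)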


title: clustering and expectation/maximization algorithms i, ii

time: 6/28, 14:00-16:00


abstract:

(1) searching for the optimal combination of the regularization parameter and the width parameter of the gaussian kernel

----grid search, nested uniform design method (udm).

----experimental results: grid search vs udm (13/9) vs udm (9/5) (a grid-search sketch follows this list).
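a sketch of the plain grid search being compared (the scorer below is a stand-in for training and validating a gaussian-kernel svm; udm replaces the exhaustive double loop with a small nested uniform-design pattern, e.g. 13 points and then 9 around the best one):

import numpy as np
from itertools import product

def validation_accuracy(C, gamma):
    # stand-in scorer: a smooth dummy surface in place of an actual
    # train/validate run with the pair (C, gamma).
    return -((np.log10(C) - 1.0)**2 + (np.log10(gamma) + 1.0)**2)

C_grid = np.logspace(-2, 4, 7)       # regularization parameter
gamma_grid = np.logspace(-3, 3, 7)   # gaussian kernel width parameter
best_C, best_gamma = max(product(C_grid, gamma_grid),
                         key=lambda p: validation_accuracy(*p))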


(2) three fundamental algorithms

----naive bayes classifier.

----k-nearest neighbors algorithm.

----online perceptron algorithm (a minimal sketch follows this list).
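of the three, the online perceptron fits in a few lines; a minimal sketch, assuming labels y_i in {-1, +1}:

import numpy as np

def perceptron(X, y, epochs=10):
    # online update: whenever a point is misclassified,
    # i.e. y_i * (w . x_i + b) <= 0, nudge the plane toward it.
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:
                w += yi * xi
                b += yi
    return w, b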


(3) unsupervised learning problems

----k-means clustering problem formulation.

----k-means algorithm (a minimal sketch follows this list).
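a minimal sketch of the algorithm (assign every point to its nearest center, move every center to the mean of its cluster, repeat until the centers stop moving):

import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assignment step: nearest center for every point
        d = ((X[:, None, :] - centers[None, :, :])**2).sum(-1)
        labels = d.argmin(1)
        # update step: each center moves to the mean of its cluster
        new = np.array([X[labels == j].mean(0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels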


(4) expectation/maximization algorithm

----e-step: compute the probability that point n was generated by mixture component k.

----m-step: update the mean, variance, and mixing probability of each component k (update formulas sketched below).
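for a mixture of k gaussians with mixing weights \(\pi_k\) (one-dimensional for brevity; standard em updates, stated for reference), the e-step computes the responsibilities

\[
r_{nk} = \frac{\pi_k \, \mathcal{N}(x_n \mid \mu_k, \sigma_k^2)}{\sum_j \pi_j \, \mathcal{N}(x_n \mid \mu_j, \sigma_j^2)},
\]

and the m-step re-estimates each component from them:

\[
\mu_k = \frac{\sum_n r_{nk} \, x_n}{\sum_n r_{nk}}, \qquad
\sigma_k^2 = \frac{\sum_n r_{nk} \, (x_n - \mu_k)^2}{\sum_n r_{nk}}, \qquad
\pi_k = \frac{1}{N} \sum_n r_{nk}.
\]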

