Greedy Sparse Least-Squares SVM implementation (St. Petersburg Polytechnic State University)

zhenyatos/gsls-svm


gsls-svm

Greedy Sparse Least-Squares (GSLS) SVM and how to use it in regression analysis. This algorithm was invented by Gavin C. Cawley and Nicola L.C. Talbot.

(example plot)

Requirements

The `GSLS_SVM` function from `GSLS-SVM.jl` has no external dependencies.

However, if you want to test it with `test.jl`, you should install:

- Plots
- Distributions

Launch

From the Julia REPL:

```julia
julia> include("main.jl")
```

Description

As the name suggests, GSLS is a greedy algorithm. Its purpose is to construct a sparse approximation of the LS-SVM solution to the regularized least-squares regression problem. Given training data

$D = \{(\mathbf{x}_i, y_i)\}_{i=1}^{\ell}$,

where

$\mathbf{x}_i \in \mathcal{X} \subseteq \mathbb{R}^d, \quad y_i \in \mathbb{R}$,

LS-SVM with kernel function

$K\colon \mathcal{X} \times \mathcal{X} \to \mathbb{R}$

determines coefficients

$\alpha_1, \dots, \alpha_\ell, \; b$

for the solution to the mentioned regression problem

$f(\mathbf{x}) = \sum_{i=1}^{\ell} \alpha_i K(\mathbf{x}_i, \mathbf{x}) + b,$

which minimises the LS-SVM objective function.

We aim to find an approximation (which we call sparse) such that, for some proper subset of the training inputs (which we call the dictionary)

$\{\mathbf{x}_{d_1}, \dots, \mathbf{x}_{d_m}\}, \quad m < \ell,$

the coefficients

$\beta_1, \dots, \beta_m, \; b$

of the function

$\tilde{f}(\mathbf{x}) = \sum_{i=1}^{m} \beta_i K(\mathbf{x}_{d_i}, \mathbf{x}) + b$

minimise the GSLS SVM objective function

$L = \frac{1}{2} \sum_{i,j=1}^{m} \beta_i \beta_j K(\mathbf{x}_{d_i}, \mathbf{x}_{d_j}) + \frac{\gamma}{2} \sum_{i=1}^{\ell} \bigl(y_i - \tilde{f}(\mathbf{x}_i)\bigr)^2$

as much as possible, where γ is the regularization parameter. At each iteration, GSLS considers the remaining training vectors as candidate support vectors, evaluates the objective function with each candidate included, and, in a greedy manner, incorporates the best possible support vector (for the current iteration) into the dictionary, then proceeds to the next iteration. The process terminates once the dictionary reaches a pre-determined size. A more detailed description of this simple but efficient algorithm can be found in the paper.
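The greedy loop just described can be sketched in Julia as follows. This is a minimal, self-contained illustration, not the repository's implementation: the helper names (`kernel_rbf`, `fit_on_dict`, `gsls_sketch`) and the exact form of the regularised linear system are assumptions of mine.

```julia
using LinearAlgebra

# RBF kernel as a higher-order function (illustrative form).
kernel_rbf(σ) = (x, z) -> exp(-sum(abs2, x .- z) / (2σ^2))

# Regularised least-squares fit restricted to a candidate dictionary `dict`:
# minimise ‖y - f(X)‖² + (1/γ) βᵀ K_DD β over β and b (the γ-weighted GSLS
# objective up to an overall scaling), then report the achieved loss.
function fit_on_dict(kernel, X, y, γ, dict)
    ℓ, m = length(X), length(dict)
    Φ   = [kernel(X[i], X[j]) for i in 1:ℓ, j in dict]   # ℓ × m design matrix
    Kdd = [kernel(X[i], X[j]) for i in dict, j in dict]  # m × m dictionary Gram
    A   = [Φ'Φ + Kdd / γ   Φ' * ones(ℓ);                 # normal equations for (β, b)
           ones(ℓ)' * Φ    ℓ]
    θ   = A \ [Φ' * y; sum(y)]
    β, b = θ[1:m], θ[end]
    resid = y .- (Φ * β .+ b)
    return β, b, dot(resid, resid) + dot(β, Kdd * β) / γ
end

# Greedy loop: at each step, add the training point whose inclusion in the
# dictionary minimises the loss; stop when the dictionary reaches sv_num.
function gsls_sketch(kernel, X, y, γ, sv_num)
    dict, β, b = Int[], Float64[], 0.0
    for _ in 1:sv_num
        best = (Inf, 0, β, b)
        for j in setdiff(1:length(X), dict)
            βj, bj, loss = fit_on_dict(kernel, X, y, γ, vcat(dict, j))
            loss < best[1] && (best = (loss, j, βj, bj))
        end
        push!(dict, best[2]); β, b = best[3], best[4]
    end
    return dict, β, b
end
```

Note the cost: this naive sketch refits a linear system for every candidate at every iteration; the paper describes cheaper incremental updates.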

Usage

Let's see how to use GSLS SVM in regression analysis.

1. Given values `X::Vector{Float64}` of the predictor and outcomes `y::Vector{Float64}`, prepare the data for training GSLS SVM like this:

```julia
𝑿 = [[x] for x in X]
𝒚 = transpose(y)
```

2. Then choose the number of support vectors `sv_num::Int`, the regularization parameter `γ::Float64`, and a kernel function `kernel` (construct it using the higher-order functions `kernel = kernel_RBF(σ)` or `kernel = kernel_polynomial(n, r)`), and pass all of this to the GSLS SVM algorithm:

```julia
dict_indices, 𝜷, b = GSLS_SVM(kernel, 𝑿, 𝒚, γ, sv_num)
```

3. Finally, you have all you need to build the empirical estimate of the theoretical regression model:

```julia
f(x) = b + sum([𝜷[i] * kernel(𝑿[dict_indices[i]], [x])
                for i=1:length(dict_indices)])
```
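To see step 3 in isolation, here is a hedged, self-contained illustration: the kernel closure below is one plausible form of `kernel_RBF` (an assumption of mine, not copied from the repository), and the dictionary indices, coefficients, and bias are made-up stand-ins for values that `GSLS_SVM` would return.

```julia
# Assumed RBF kernel form, for illustration only.
kernel_RBF(σ) = (x, z) -> exp(-sum(abs2, x .- z) / (2σ^2))

𝑿 = [[0.0], [0.5], [1.0], [1.5], [2.0]]  # toy predictor values
dict_indices = [1, 3, 5]                  # pretend support-vector indices
𝜷 = [0.7, -0.2, 0.4]                      # pretend coefficients
b = 0.1                                   # pretend bias

kernel = kernel_RBF(0.8)

# The sparse regression estimate: a kernel expansion over the dictionary only.
f(x) = b + sum(𝜷[i] * kernel(𝑿[dict_indices[i]], [x])
               for i in 1:length(dict_indices))
```

Far from every support vector the RBF terms vanish, so `f` decays to the bias `b` — a quick sanity check that the expansion is wired up correctly.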
