
Examples

Domingo Mery edited this page Dec 10, 2018 · 25 revisions

1. Applications

Fully automated design of a computer vision system

In this example, we show how to read images from a directory, label the images into classes, extract features, select features, select a classifier, and evaluate the performance in only 12 lines! See Balu Code, and see the methodology in this paper. The graphic user interfaces Bfx_gui and Bcl_gui are also available.

See how this 10-line code can be used to automatically design a computer vision system to detect faces. The dataset contains 60 faces and 200 non-faces. Using only LBP features, Balu is able to select 15 features and a classifier with a performance of 95%, validated with cross-validation (warning: the 95% is achieved on this data set; there is no warranty for other data sets). See Balu Code.
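The pipeline behind these automated designs (extract, select, classify, evaluate) can be sketched with the Balu calls used elsewhere on this page. This is a hedged sketch: datagauss stands in for real image features, and the Bfs_balu option fields are assumptions — type help Bfs_balu for the exact interface.

```matlab
load datagauss                       % simulated data (X,d,Xt,dt); in the face example
                                     % the rows of X would be LBP feature vectors
opf.m = 1; opf.show = 0;             % number of features to select (assumed fields)
s   = Bfs_balu(X,d,opf);             % feature selection: normalize + clean + SFS
opc.p = [0.5 0.5];                   % a priori probabilities
ds  = Bcl_lda(X(:,s),d,Xt(:,s),opc); % train and test an LDA classifier
p   = Bev_performance(ds,dt)         % estimated performance on the test data
```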

Automated detection using a tracking algorithm in multiple X-ray views

In this example, we show how to detect pen tips in pencil cases using a tracking algorithm (see Balu Code). Details of the method are published in this paper and video. If you want to use the graphic user interface shown in this video, execute the command Btr_gui and load the file pencase.mat.

2. Image Processing

Segmentation of an object in a homogeneous background

Segmentation.png

With only the command Bio_segshow('testimg1.jpg') you can obtain this figure. In addition, you can test any segmentation algorithm using simple commands like the ones below. Details of the method are published in this paper.

I = imread('testimg2.jpg');
Bio_segshow(I,'Bim_segpca');

or

R = Bim_segpca(I);
imshow(R)

Segmentation using clustering

Segmentation.png

Try this command to segment the sky, the clouds, and the palm in this image:

I = imread('testimg9.jpg');
Bim_segkmeans(I,3,1,1);

Segmentation of details

With only the following commands you can segment the well-known rice image from MATLAB:

I = imread('rice.png');
figure; imshow(I)
[F,m] = Bim_segmowgli(I,[],40,1.5);
figure; imshow(F,[])

Segmentation using sliding windows

In this example, you can see how welding defects can be detected using sliding windows. The example selects 100 'no-defects' and 100 'defects' from a training image (where the ideal segmentation is a priori known). Afterward, LBP features are extracted and an LDA classifier is trained using SFS-selected features. Finally, the trained classifier is used to segment a test image (see Balu Code). Details of the method are published in this paper.

Interactive segmentation

Sometimes you need to separate certain objects of an image, and no segmentation approach works well. Try the function Bim_regiongrow with this image; for example, it is easy to separate the pen tips using this tool.
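A minimal interactive session might look like the sketch below. The exact signature of Bim_regiongrow is an assumption here — type help Bim_regiongrow for the real interface.

```matlab
I = imread('testimg2.jpg');   % any image with objects to separate
R = Bim_regiongrow(I);        % click inside a region to grow it (assumed usage)
imshow(R)                     % mask of the selected object
```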

Lab conversion

If you want to calibrate your computer vision system in order to process L*a*b* color images (see our paper), first you have to estimate the parameters of the model M that converts from RGB to L*a*b* using the function Bim_labparam, and second you can convert an RGB image X using the command:

Y = Bim_rgb2lab(X,M);

If you don't want to calibrate the computer vision system, you can use the theoretical conversion implemented in the function Bim_rgb2lab0.
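The two conversion paths from the text can be summarized as follows (only the functions named above are used; the calibrated call requires a model M previously estimated with Bim_labparam from measured color data):

```matlab
X  = imread('testimg2.jpg');  % RGB image
Y0 = Bim_rgb2lab0(X);         % theoretical RGB -> L*a*b*, no calibration needed
% With a calibrated model M (estimated beforehand using Bim_labparam):
% Y = Bim_rgb2lab(X,M);       % calibrated RGB -> L*a*b* conversion
```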

3. Feature Extraction

How to separate T & Y

ExampleClassificationTY.png

In this example, we show how to separate the characters 'T' and 'Y' using the eccentricity of the segmented regions. We use the train image to establish the threshold automatically and the test image to evaluate it (see Balu Code).
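The idea can be sketched with standard MATLAB Image Processing Toolbox calls. The image name and the threshold value are placeholders; in the Balu example the threshold is learned automatically from the training image.

```matlab
I   = imread('chars.png');            % hypothetical image containing T's and Y's
B   = ~imbinarize(rgb2gray(I));       % dark characters on a light background
st  = regionprops(B,'Eccentricity');  % one eccentricity value per region
ecc = [st.Eccentricity];
isT = ecc > 0.9;                      % placeholder threshold separating the classes
```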

Hu moments

In this example, we show how to use Hu moments to obtain a good separability in the recognition of the characters '1', '2', and '3' in this image (see Balu Code).

How to build feature extraction functions in Balu

In this example, we show how to build a function that computes the centroid of a binary region (see Balu Code).
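A minimal extractor in that spirit could look like the sketch below. The [X,Xn] output convention (feature values plus feature names) is assumed from other Bfx_* functions, and the function name is hypothetical.

```matlab
function [X,Xn] = Bfx_centroid(R)
% R:  binary image containing one region.
% X:  feature values [i_mean j_mean]; Xn: one name per feature.
[i,j] = find(R);                    % pixel coordinates of the region
X  = [mean(i) mean(j)];             % centroid (row, column)
Xn = ['centroid i'; 'centroid j'];  % feature names, one per row
end
```

Usage, assuming R is a binary image with a single region: [X,Xn] = Bfx_centroid(R).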

Arrows recognition

In this example, we show how to separate three types of arrows using simple Balu commands. The classification is performed by thresholding only one feature. See Training image, Test image, and Balu Code.

Ellipses

ExampleEllipses.png

In this example, we show how to fit a binary region to an ellipse. In example 1, we show the best ellipses fitted to the binary regions of this image. In example 2, we detect elliptical objects oriented at a given angle (see Balu Code).

4. Feature Selection

Feature selection with Balu algorithm

ExampleSFS.png

This example shows how to use the Bfs_balu algorithm. The algorithm has three steps: (1) it normalizes the features (using Bft_norm), (2) cleans them (using Bfs_clean), and (3) selects features (using Bfs_sfs). In this example, the objective function maximized by the SFS algorithm is the performance of an SVM classifier with an RBF kernel (see Balu Code).
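The three internal steps can also be called individually. This is a hedged sketch — the second arguments and option fields are assumptions, and a simple Fisher objective is used instead of the SVM performance; type help on each function for the exact interface.

```matlab
load datagauss                  % simulated data (2 classes, 2 features)
Xn = Bft_norm(X,1);             % (1) normalize features (assumed arg: norm type)
ic = Bfs_clean(Xn);             % (2) drop constant/correlated features (assumed)
op.m = 1;                       % select 1 feature (datagauss has only 2)
op.b.name = 'fisher';           % assumed objective function criterion
op.show = 0;
s  = Bfs_sfs(Xn(:,ic),d,op);    % (3) sequential forward selection
```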

Feature selection with LSEF

LSEF selects feature subsets based on their capacity to reproduce sample projections on principal axes. It can be used to approximate PCA using a linear projection of some of the original features (see Balu Code).

Feature selection with exhaustive search

In this example, we preselect 10 features using SFS with the Fisher criterion, and afterward we select 4 of them using exhaustive search. The objective function maximized by the exhaustive search is the performance of a KNN classifier with 5 neighbors (see Balu Code).

Comparison of feature selection algorithms

In this example, we show how to test several feature selection algorithms and their combinations see Balu Code.

5. Classification

In Balu, there are some definitions:

  • X is the training data (one sample per row). It is a matrix of N x m elements: N samples, each with m features.

  • d is the ideal classification of X. It is a vector of N x 1 elements. For example, d(i) is 1 if sample i belongs to class 1.

  • Xt is the testing data, defined analogously to X. It is a matrix of Nt x m elements: Nt samples, each with m features.

  • dt is the ideal classification of Xt. It is a vector of Nt x 1 elements. dt is never used to train a classifier, it is used to evaluate the classification of Xt.

  • ds is the real classification of Xt, i.e., the prediction using the trained classifier. It is a vector of Nt x 1 elements. ds has to be compared with dt to evaluate the performance of the classification of Xt. If the prediction is perfect, ds(i) is equal to dt(i) for every testing sample i = 1, ... Nt.

  • op are the options of the classifier. For example, for KNN with 3 neighbors, we define op.k=3.
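These definitions can be exercised directly with the datagauss data used in the examples below, here with the KNN classifier and the op.k = 3 setting mentioned above:

```matlab
load datagauss               % provides X, d, Xt, dt as defined above
op.k = 3;                    % KNN with 3 neighbors
ds = Bcl_knn(X,d,Xt,op);     % ds: real (predicted) classification of Xt
p  = Bev_performance(ds,dt)  % fraction of samples with ds(i) == dt(i)
```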

In Balu, the classifiers are implemented as functions Bcl_name, where name is the name of the classifier. See the following example that shows Linear Discriminant Analysis using Bcl_lda.

Example with LDA

With simple code lines, you are able to train and test a classifier. ExampleLDA.png

LDA: Training & Test together

load datagauss % simulated data (2 classes, 2 features)
Bio_plotfeatures(X,d) % plot feature space
op.p = [0.5 0.5]; % a priori probabilities
ds = Bcl_lda(X,d,Xt,op); % LDA classifier
p = Bev_performance(ds,dt) % performance on test data

LDA: Only Training

load datagauss % simulated data (2 classes, 2 features)
Bio_plotfeatures(X,d) % plot feature space
op.p = [0.5 0.5]; % a priori probabilities
op = Bcl_lda(X,d,op); % LDA classifier

LDA: Only Testing (after training)

ds = Bcl_lda(Xt,op); % LDA classifier
p = Bev_performance(ds,dt) % performance on test data

The same code can be used with Bcl_qda, Bcl_knn, etc.; you only have to define the options variable op correctly. Each classifier has an example; e.g., for KNN, type help Bcl_knn to see a very simple example that shows how to define the options.
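For instance, swapping LDA for QDA changes only the function name (assuming Bcl_qda accepts the same op.p priors field — check help Bcl_qda):

```matlab
load datagauss               % simulated data (2 classes, 2 features)
op.p = [0.5 0.5];            % same a priori probabilities as the LDA example
ds = Bcl_qda(X,d,Xt,op);     % identical calling convention, QDA instead of LDA
p  = Bev_performance(ds,dt)  % performance on test data
```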

Decision lines for classifiers of two features

In this example, we show how you can plot the decision lines of different classifiers (see Balu Code).

DecisionLines.jpg

6. Evaluation

Cross-validation

With Balu it is very easy to evaluate many classifiers on the same data. Take a look at this code to see how to compute the 10-fold cross-validation performance of 9 classifiers (this example is displayed by typing help Bev_crossval).

ExampleCrossVal.png
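A hedged sketch of such a call, with two classifiers instead of nine, is shown below. The option field names (b, v, c) and the classifier-definition format are assumptions — help Bev_crossval gives the exact interface.

```matlab
load datagauss                                   % simulated data (X, d)
b(1).name = 'lda'; b(1).options.p = [0.5 0.5];   % classifier 1 (assumed format)
b(2).name = 'knn'; b(2).options.k = 5;           % classifier 2
op.b = b;                                        % classifiers to compare
op.v = 10;                                       % 10-fold cross-validation
op.c = 0.95;                                     % confidence level (assumed field)
[p,ci] = Bev_crossval(X,d,op)                    % performance and confidence interval
```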