Max-margin Deep Conditional Generative Models for Semi-Supervised Learning

Chongxuan Li, Jun Zhu and Bo Zhang

Full paper: a journal version of our NIPS15 paper (original paper and code). A novel class-conditional variant of mmDGMs is proposed.

Summary of Max-margin Deep Conditional Generative Models (mmDCGMs)

  • We boost the effectiveness and efficiency of DGMs in semi-supervised learning by
    • Employing advanced CNNs as the x2y, xy2z and zy2x networks
    • Approximating the posterior inference of labels
    • Proposing powerful max-margin discriminative losses for labeled and unlabeled data (see the sketch after this list)
  • The resulting mmDCGMs can
    • Perform efficient inference: constant time with respect to the number of classes
    • Achieve state-of-the-art classification results on several benchmarks: MNIST, SVHN and NORB with 1000 labels, and MNIST with full labels
    • Disentangle classes and styles of raw images, without preprocessing such as PCA, given a small amount of labels
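
The max-margin discriminative loss for labeled data is a multiclass hinge loss on the classifier (x2y) scores; for unlabeled data, the classifier's own prediction can be plugged in as a hard label instead of marginalizing the generative bound over every class, which is what makes inference constant time in the number of classes. The NumPy sketch below only illustrates this idea; the margin value, function names and exact treatment of unlabeled data are illustrative, not copied from the paper or the released code.

```python
import numpy as np

def multiclass_hinge_loss(scores, labels, margin=1.0):
    """Crammer-Singer style multiclass hinge loss.
    scores: (batch, num_classes) outputs of the x2y classifier
    labels: (batch,) integer class labels
    """
    idx = np.arange(scores.shape[0])
    true_scores = scores[idx, labels]                        # score of the correct class
    violations = np.maximum(0.0, margin + scores - true_scores[:, None])
    violations[idx, labels] = 0.0                            # the true class incurs no penalty
    return violations.max(axis=1).mean()                     # worst violating class per example

def pseudo_label_hinge_loss(scores, margin=1.0):
    """For unlabeled data, use the classifier's own prediction as a hard
    pseudo-label, avoiding a marginalization over all classes."""
    return multiclass_hinge_loss(scores, scores.argmax(axis=1), margin)

# toy usage with 10 classes (as in MNIST / SVHN)
scores = np.random.randn(4, 10)
labels = np.array([3, 1, 7, 0])
print(multiclass_hinge_loss(scores, labels))
print(pseudo_label_hinge_loss(scores))
```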

Some libs we used in our experiments

Python, NumPy, SciPy, Theano, Lasagne, Parmesan

State-of-the-art results on the MNIST, SVHN and NORB datasets with 1000 labels, and results competitive with the best CNNs given all labels on MNIST

chmod +x *.sh

./cdgm-svhn-ssl_1000.sh gpu0 (run the corresponding .sh file to obtain each result)

For the small NORB dataset, please download the raw images in .mat format from http://www.cs.nyu.edu/~ylclab/data/norb-v1.0-small/ and run datasets_norb.convert_orig_to_np() to convert them into NumPy format.
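
A minimal usage sketch of that conversion step is below; the exact arguments and expected file locations are defined by datasets_norb.py in this repository, so check that module before running (calling with no arguments is an assumption).

```python
# One-off conversion of the downloaded small-NORB .mat files into NumPy format.
# Assumes the raw files sit where datasets_norb expects them; see datasets_norb.py
# for the exact paths (the no-argument call is an assumption, not verified).
import datasets_norb

datasets_norb.convert_orig_to_np()
```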

See Table 6 and Table 7 in the paper for the classification results.

Class conditional generation of raw images given a few labels

Results on MNIST given 100 labels (left: the 100 labeled images sorted by class; right: generated samples, where each row shares the same class and each column shares the same style)

Results on SVHN given 1000 labels

Results on small NORB given 1000 labels
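
The class/style grids above can be produced by fixing one label per row and one latent style code per column, then decoding every pair with the zy2x generator. The sketch below only illustrates that layout; decode is a hypothetical stand-in for the trained zy2x network, not a function from this repository.

```python
import numpy as np

def decode(z, y_onehot):
    """Hypothetical placeholder for the trained zy2x generator network."""
    return np.zeros((28, 28))  # would return a generated image for (style z, class y)

num_classes, num_styles, latent_dim = 10, 10, 100
styles = np.random.randn(num_styles, latent_dim)              # one style code z per column
grid = [[decode(z, np.eye(num_classes)[c]) for z in styles]   # each row shares one class,
        for c in range(num_classes)]                          # each column shares one style
```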