1. CONTRIBUTIONS
The key contributions of our work are:
• Analysis of spatial receptive field (RF) designs for pooled features.
• Evidence that spatial pyramids may be suboptimal for feature generation.
• An algorithm that jointly learns adaptive RFs and the classifier, with an efficient implementation using overcompleteness and structured sparsity.

2. THE PIPELINE
State-of-the-art classification algorithms use a two-layer pipeline: a coding layer learns activations from local image patches, and a pooling layer aggregates the activations over multiple spatial regions. Linear classifiers are then learned on the pooled features.
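As a concrete illustration, a minimal sketch of the two-layer pipeline. The soft-threshold coding, codebook, and pooling regions here are illustrative assumptions, not the exact configuration used in our experiments:

```python
import numpy as np

def encode_patches(patches, codebook):
    # Coding layer: map each local patch to a K-dimensional activation
    # (here a simple rectified dot product with a hypothetical codebook).
    return np.maximum(patches @ codebook.T, 0.0)          # (num_patches, K)

def pool_regions(activations, positions, regions, op=np.max):
    # Pooling layer: aggregate the activations that fall inside each
    # spatial region, then concatenate into one pooled feature vector.
    K = activations.shape[1]
    pooled = []
    for (x0, y0, x1, y1) in regions:
        inside = ((positions[:, 0] >= x0) & (positions[:, 0] < x1) &
                  (positions[:, 1] >= y0) & (positions[:, 1] < y1))
        pooled.append(op(activations[inside], axis=0) if inside.any()
                      else np.zeros(K))
    return np.concatenate(pooled)

# The pooled vectors of all training images are then fed to a linear
# classifier (e.g. a linear SVM) to learn f(x, theta).
```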
3. NEUROSCIENCE INSPIRATION

4. SPATIAL POOLING REVISITED
• Much work has been done on the coding part, while the spatial pooling methods are often hand-crafted.
• Sample performance on CIFAR-10 with different receptive field designs:
Note the suboptimality of SPM: random selection from an overcomplete set of spatially pooled features consistently outperforms SPM.
• We propose to learn the spatial receptive fields as well as the codes and the classifier.

5. NOTATIONS
• I: image input.
• A^1, ..., A^K: code activations as matrices, with A^k_{ij} the activation of code k at position (i, j).
• R_i: RF of the i-th pooled feature.
• op(·): pooling operator, such as max(·).
• f(x, θ): the classifier based on the pooled features x.
• A pooled feature x_i is defined by choosing a code indexed by c_i and a spatial RF R_i: x_i = op(A^{c_i}_{R_i}), where A^{c_i}_{R_i} denotes the activations of code c_i inside region R_i. The vector of pooled features x is then determined by the set of parameters C = {c_1, ..., c_M} and R = {R_1, ..., R_M} (see the sketch after Sec. 6).

6. THE LEARNING PROBLEM
• Given a set of training data {(I_n, y_n)}_{n=1}^N, we jointly learn the classifier and the pooled features (assuming that coding is done in an unsupervised way) by minimizing the classification loss with respect to the classifier parameters θ and the pooling parameters C, R.
• Advantage: pooled features are tailored towards the classification task (this also reduces redundancy).
• Disadvantage: the problem may be intractable, as there is an exponential number of possible receptive fields.
• Solution: a reasonably overcomplete set of receptive field candidates + sparsity constraints to control the number of final features.
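A minimal sketch of the notation above: each pooled feature x_i = op(A^{c_i}_{R_i}) picks one code and one rectangular RF, and the candidate RFs can be enumerated over a regular grid (see Sec. 7 below). The array shapes, the grid size, and the rectangle enumeration are illustrative assumptions:

```python
import numpy as np
from itertools import combinations

def pooled_feature(A, c, R, op=np.max):
    # x_i = op(A^{c_i}_{R_i}): pool the activation map of code c over the
    # rectangular receptive field R = (top, left, bottom, right).
    top, left, bottom, right = R
    return op(A[c, top:bottom, left:right])

def overcomplete_rfs(grid_h, grid_w):
    # All rectangles over a regular (grid_h x grid_w) pooling grid -- one
    # simple way to build an overcomplete candidate set.
    return [(t, l, b, r)
            for (t, b) in combinations(range(grid_h + 1), 2)
            for (l, r) in combinations(range(grid_w + 1), 2)]

# Example: K = 4 activation maps on a 4x4 grid -> 4 * 100 candidate features.
A = np.random.rand(4, 4, 4)
candidates = [(c, R) for c in range(4) for R in overcomplete_rfs(4, 4)]
x = np.array([pooled_feature(A, c, R) for (c, R) in candidates])
```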
7. OVERCOMPLETE RF
• We propose to use overcomplete receptive field candidates based on regular grids:

(Figure: receptive field candidates: (a) Base, (b) SPM, (c) Ours.)

• Structured sparsity regularization is adopted to select only a subset of the features for classification.

8. GREEDY FEATURE SELECTION
• Directly performing the optimization is still time- and memory-consuming.
• Following [Perkins JMLR03], we adopt an incremental, greedy approach that selects features based on their scores (a minimal sketch follows this list).
• After each increment, the model is retrained only with respect to the active subset of selected features to ensure fast re-training.
• Benefit of overcompleteness in spatial pooling + feature selection: higher performance with smaller codebooks and lower feature dimensions.
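A grafting-style sketch of the greedy selection, following [Perkins JMLR03]: candidate features are scored by the gradient magnitude of the loss at their current (zero) weight, the best candidate is added, and only the active subset is retrained. The squared-loss objective, ridge term, and fixed feature budget here are simplifying assumptions, not our exact objective:

```python
import numpy as np

def greedy_select(X, y, num_features, ridge=1e-3):
    # X: (n, M) matrix of overcomplete pooled features; y: (n,) labels in {-1, +1}.
    n = X.shape[0]
    active, w = [], np.zeros(0)
    for _ in range(num_features):
        resid = y - X[:, active] @ w                # residual of the current model
        scores = np.abs(X.T @ resid) / n            # |dLoss/dw_j| for each candidate
        scores[active] = -np.inf                    # never re-select a feature
        active.append(int(np.argmax(scores)))
        Xa = X[:, active]                           # retrain on the active subset only
        w = np.linalg.solve(Xa.T @ Xa + ridge * np.eye(len(active)), Xa.T @ y)
    return active, w
```

In grafting proper, selection stops once no remaining score exceeds the regularization weight; the fixed budget num_features above stands in for that criterion.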
9. RESULTS
• Performance comparison on CIFAR-10 with state-of-the-art approaches:
• Results on MNIST and the 1-vs-1 saliency map obtained from our algorithm:
10. REFERENCES
• A. Coates and A. Y. Ng. The importance of encoding versus training with sparse coding and vector quantization. ICML, 2011.
• S. Perkins, K. Lacker, and J. Theiler. Grafting: fast, incremental feature selection by gradient descent in function space. JMLR, 3:1333–1356, 2003.
• D. H. Hubel and T. N. Wiesel. Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. J. of Physiology, 160(1):106–154, 1962.