1. Contributions

The key contributions of our work are:
• Analysis of spatial receptive field (RF) designs for pooled features.
• Evidence that spatial pyramids may be suboptimal for feature generation.
• An algorithm that jointly learns adaptive RFs and the classifier, with an efficient implementation using over-completeness and structured sparsity.

2. The Pipeline

State-of-the-art classification algorithms use a two-layer pipeline: the coding layer learns activations from local image patches, and the pooling layer aggregates activations over multiple spatial regions. Linear classifiers are then learned from the pooled features.

3. Neuroscience Inspiration

4. Spatial Pooling Revisited

• Much work has been done on the coding part, while spatial pooling methods are often hand-crafted.
• Sample performance on CIFAR-10 with different receptive field designs shows the suboptimality of SPM: random selection from an overcomplete set of spatially pooled features consistently outperforms SPM.
• We propose to learn the spatial receptive fields jointly with the codes and the classifier.

5. Notations

• I: image input.
• A^1, ..., A^K: code activations as matrices, with A^k_{ij} the activation of code k at position (i, j).
• R_i: receptive field of the i-th pooled feature.
• op(·): pooling operator, such as max(·).
• f(x, θ): the classifier based on the pooled features x.
• A pooled feature x_i is defined by choosing a code indexed by c_i and a spatial receptive field R_i:
      x_i = op(A^{c_i}_{R_i}),
  i.e. the pooling operator applied to the activations of code c_i inside R_i. The vector of pooled features x is then determined by the parameter sets C = {c_1, ..., c_M} and R = {R_1, ..., R_M}.
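The pooled-feature definition above can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation; the helper names `pooled_feature` and `pool_features`, the rectangle encoding of R_i, and the toy sizes are all assumptions for the example.

```python
import numpy as np

def pooled_feature(activations, rf, op=np.max):
    """Pool the activation map of one code over a rectangular
    receptive field. `activations` is A^k (H x W); `rf` is a
    (top, left, bottom, right) rectangle R_i; `op` is the pooling
    operator op(.), e.g. np.max or np.mean."""
    top, left, bottom, right = rf
    return op(activations[top:bottom, left:right])

def pool_features(A, C, R, op=np.max):
    """Build the pooled feature vector x from code activations
    A (K x H x W), code indices C = [c_1, ..., c_M] and
    receptive fields R = [R_1, ..., R_M]."""
    return np.array([pooled_feature(A[c], rf, op) for c, rf in zip(C, R)])

# Toy example: K = 2 codes on a 4x4 activation grid, M = 3 pooled features.
rng = np.random.default_rng(0)
A = rng.random((2, 4, 4))
C = [0, 1, 1]                                    # code index c_i per feature
R = [(0, 0, 2, 2), (0, 0, 4, 4), (2, 2, 4, 4)]   # rectangles R_i
x = pool_features(A, C, R)
print(x.shape)  # (3,)
```

Each (c_i, R_i) pair yields one scalar x_i, so the classifier sees a fixed-length vector regardless of how the receptive fields tile the image.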
6. The Learning Problem

• Given a set of training data {(I_n, y_n)}, n = 1, ..., N, we jointly learn the classifier and the pooled features (assuming that coding is done in an unsupervised way):
      min_{θ, C, R}  (1/N) Σ_n L(f(x_n, θ), y_n) + λ Reg(θ),
  where x_n is the pooled feature vector of image I_n determined by C and R.
• Advantage: the pooled features are tailored to the classification task (this also reduces redundancy).
• Disadvantage: the problem may be intractable, as there is an exponential number of possible receptive fields.
• Solution: a reasonably overcomplete set of receptive field candidates, plus sparsity constraints to control the number of final features.

7. Overcomplete RF

• We propose to use overcomplete receptive field candidates based on regular grids. [Figure panels: (a) Base, (b) SPM, (c) Ours.]
• Structured sparsity regularization is adopted to select only a subset of features for classification: a group-sparse penalty on the classifier weights associated with each pooled feature drives most groups to zero.

8. Greedy Feature Selection

• Directly performing the optimization is still time- and memory-consuming.
• Following [Perkins JMLR03], we adopt an incremental, greedy approach that selects features by their scores: each inactive feature is scored by the magnitude of the objective's gradient with respect to its (currently zero) weights.
• After each increment, the model is retrained only with respect to the active subset of selected features, ensuring fast re-training.
• Benefit of overcompleteness in spatial pooling + feature selection: higher performance with smaller codebooks and lower feature dimensions.

9. Results

• Performance comparison on CIFAR-10 with state-of-the-art approaches.
• Results on MNIST and the 1-vs-1 saliency maps obtained from our algorithm.

10. References

• A. Coates and A. Y. Ng. The importance of encoding versus training with sparse coding and vector quantization. ICML, 2011.
• S. Perkins, K. Lacker, and J. Theiler. Grafting: fast, incremental feature selection by gradient descent in function space. JMLR, 3:1333–1356, 2003.
• D. H. Hubel and T. N. Wiesel. Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. J. of Physiology, 160(1):106–154, 1962.
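The greedy selection loop of Section 8 can be sketched as follows. This is a simplified illustration of grafting-style selection in the spirit of [Perkins JMLR03], not the paper's implementation: the function name `grafting_select`, the squared loss with an L2 penalty, and the plain gradient-descent retraining are all assumptions made for the example.

```python
import numpy as np

def grafting_select(X, y, num_features, lam=1e-3, iters=200, lr=0.1):
    """Greedy grafting-style feature selection (sketch).
    Repeatedly score every inactive candidate feature by the magnitude
    of the loss gradient at its (currently zero) weight, activate the
    highest-scoring one, then retrain only the active subset."""
    n, m = X.shape
    w = np.zeros(m)
    active = []
    for _ in range(num_features):
        resid = X @ w - y                    # dL/dw_j = X_j^T resid / n
        scores = np.abs(X.T @ resid) / n
        scores[active] = -np.inf             # score inactive features only
        active.append(int(np.argmax(scores)))
        # Fast re-training: gradient descent on the active weights only.
        for _ in range(iters):
            resid = X[:, active] @ w[active] - y
            grad = X[:, active].T @ resid / n + lam * w[active]
            w[active] -= lr * grad
    return active, w

# Toy data: the target depends only on candidate features 0 and 3.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3]
active, w = grafting_select(X, y, num_features=2)
print(sorted(active))  # [0, 3]
```

Because retraining touches only the active columns, the cost per increment stays small even when the overcomplete candidate pool is large, which is what makes the overcomplete-plus-selection strategy practical.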