1. Overview

Goal: obtain high-quality image annotation at low cost (annotation effort). [Figure: low-quality annotation → high-quality annotation]

Approach: Bayesian active learning.
- Minimize uncertainty in the boundary of the MAP prediction.
- Trade off uncertainty reduction against the cost of annotation.

Contributions:
- Entropy bounds that measure the expected perturbation needed to change the MAP prediction.
- A coarse-to-fine approach to pixel-accurate annotation that saves 33% in cost.

2. Active learning in structured spaces

Traditional active learning:
- The active learner picks which data points to label, typically assuming the data are i.i.d.

Bayesian active learning in structured spaces:
- Deals with correlated labels, e.g. the labels of a single image (a non-i.i.d. setting).
- Basic idea: construct a probability distribution over the label space and reduce its uncertainty with minimal annotation cost (clicks).

3. Active annotation framework

Approach:
- Let $y = (y_1, \ldots, y_n)$ be the set of labels for image $x$ with $n$ pixels.
- Let $A_t$ be the set of annotations obtained up to time $t$.
- Let $p(y)$ be the joint probability of the labels given the data $x$ and the annotations $A_t$.

Bayesian experimental design. Given:
- a function $U(A)$ that measures the uncertainty of the labels given the annotations, and
- a function $C(a)$ that measures the cost of an annotation $a$,

pick the annotation that provides the highest uncertainty reduction per unit cost:

  $a^* = \arg\max_a \; \big[\, U(A_t) - E[\, U(A_t \cup \{a\}) \,] \,\big] / C(a)$

Uncertainty is defined as the entropy, $U(A) = H(p)$. Computing the entropy exactly is exponential in the size of the patch for many useful cases; however, MAP estimation is tractable for some of these (e.g., via graph cuts or MPLP).

4. Markov Random Fields (MRFs) for image labeling

MRFs are popular for image segmentation (e.g. the GrabCut model, Blake et al., 2004).
- Let an annotation of an $n$-pixel image be described as an $n$-tuple $y = (y_1, \ldots, y_n)$.
- The overall score of a labeling is

  $\theta(y) = \sum_i \theta_i(y_i) + \sum_{(i,j) \in E} \theta_{i,j}(y_i, y_j)$

- The MAP estimate $\hat y = \arg\max_y \theta(y)$ can be obtained via graph cuts (Boykov et al., 2001).
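As a concrete sketch of why MAP estimation can be tractable while exact entropy is not, the toy Python example below (hypothetical potentials, not the poster's segmentation model) computes the entropy of a tiny chain MRF by full enumeration, which is exponential in the number of pixels, and recovers the same MAP labeling with a linear-time Viterbi pass:

```python
import itertools
import math

# Toy binary chain MRF (made-up potentials for illustration only).
# Score of a labeling y in {0,1}^n:
#   theta(y) = sum_i unary[i][y_i] + sum_i pair[y_i][y_{i+1}]
unary = [[0.0, 0.5], [0.2, 0.0], [0.0, 0.3]]  # per-pixel potentials
pair = [[0.4, 0.0], [0.0, 0.4]]               # attractive pairwise potential

def score(y):
    s = sum(unary[i][yi] for i, yi in enumerate(y))
    s += sum(pair[y[i]][y[i + 1]] for i in range(len(y) - 1))
    return s

# Exact entropy requires enumerating all 2^n labelings -- exponential in n.
labelings = list(itertools.product([0, 1], repeat=len(unary)))
Z = sum(math.exp(score(y)) for y in labelings)
probs = [math.exp(score(y)) / Z for y in labelings]
entropy = -sum(p * math.log(p) for p in probs)
map_brute = max(labelings, key=score)

# MAP by dynamic programming (Viterbi) -- linear in n, no enumeration.
def viterbi_map(unary, pair):
    best = list(unary[0])  # best[k] = best score of a prefix ending in state k
    back = []
    for i in range(1, len(unary)):
        ptr, new = [], []
        for k in range(2):
            cands = [best[j] + pair[j][k] for j in range(2)]
            j = max(range(2), key=lambda idx: cands[idx])
            ptr.append(j)
            new.append(cands[j] + unary[i][k])
        back.append(ptr)
        best = new
    y = [max(range(2), key=lambda k: best[k])]
    for ptr in reversed(back):
        y.append(ptr[y[-1]])  # follow backpointers to recover the labeling
    return tuple(reversed(y))

assert viterbi_map(unary, pair) == map_brute
print("MAP:", map_brute, " entropy:", round(entropy, 3))
```

On real images the brute-force enumeration is replaced by graph cuts or MPLP, as the poster notes; the point of the sketch is only the asymmetry between the two computations.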
5. MAP perturbations

The perturb-max model (Papandreou & Yuille, 2011; Tarlow et al., 2012; Gane et al., 2014). Let $\{\gamma_i(y_i)\}$ be i.i.d. Gumbel random variables with zero mean (the random functions of the model). The model is

  $p(y) = P_\gamma\big[\, y = \arg\max_{y'} \{\, \theta(y') + \sum_i \gamma_i(y'_i) \,\} \,\big]$

MAP perturbations upper-bound the partition function (Hazan & Jaakkola, 2012):

  $\log Z \le E_\gamma\big[\, \max_y \{\, \theta(y) + \sum_i \gamma_i(y_i) \,\} \,\big]$

6. Measuring uncertainty in the boundary of MAP prediction

For perturb-max models with Gumbel random variables, the entropy is upper-bounded:

  $H(p) \le E_\gamma\big[\, \sum_i \gamma_i(\hat y_i(\gamma)) \,\big]$, where $\hat y(\gamma) = \arg\max_y \{\, \theta(y) + \sum_i \gamma_i(y_i) \,\}$.

Proof idea:
- Conjugate duality; use the MAP-perturbation upper bounds.
- The optimal $\theta$ attains the perturb-max model $p(y)$, and the linear term cancels out.

Properties of the uncertainty measure:
- Nonnegative (upper bounds the entropy).
- Attains its minimal value for a zero-one distribution (zero-mean perturbations).
- Attains its maximal value for the uniform distribution (by symmetry).

7. Active boundary annotation

Coarse-to-fine boundary refinement:
- We start from a coarse boundary and repeatedly refine it: regions are picked by the algorithm, and refinement is done by the user.
- Cost of a refinement = the number of points in the polygons (boundary complexity).
- We don't know the ground truth, so we compute expectations of cost and uncertainty instead.

8. Experimental evaluation

[Figure: an example coarse-to-fine refinement, with sampled regions for various strategies.]
[Figure: active annotation results.]

9. Conclusions and future work

We proposed a new uncertainty measure that:
- avoids expensive MCMC sampling by randomly perturbing the model and using a MAP solver as a black-box tool;
- has applications to parameter estimation and active learning in areas such as matchings, parse trees, and other combinatorial structures.

Active learning in structured spaces:
- The sampling-based approach lets us handle non-decomposable cost functions. For the boundary annotation task we used boundary complexity, which cannot be computed from marginal estimates.
- This led to 33% savings in annotation time for pixel-accurate boundary annotations.

Challenges:
- MAP-perturbation entropy bounds for higher-dimensional perturbations.
- Going beyond super-modular functions in the context of active learning.
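The partition-function and entropy bounds can be estimated by Monte Carlo with nothing more than repeated MAP calls. The sketch below uses a toy two-variable model with made-up potentials (not the poster's segmentation energy): each label's potential is perturbed by an i.i.d. zero-mean Gumbel, the perturbed MAP is taken, and the perturbation values at the maximizer are averaged.

```python
import itertools
import math
import random

random.seed(0)

# Toy perturb-max example (hypothetical potentials): theta rewards agreement.
def theta(y):
    return 1.0 if y[0] == y[1] else 0.0

labelings = list(itertools.product([0, 1], repeat=2))

# Exact quantities by enumeration (feasible only because the model is tiny).
Z = sum(math.exp(theta(y)) for y in labelings)
H = -sum((math.exp(theta(y)) / Z) * (theta(y) - math.log(Z)) for y in labelings)

EULER = 0.5772156649015329

def gumbel():
    # Zero-mean Gumbel: standard Gumbel minus the Euler-Mascheroni constant.
    return -math.log(-math.log(random.random())) - EULER

S = 50_000
logz_bound = 0.0  # estimates E[max_y {theta(y) + sum_i gamma_i(y_i)}]
ent_bound = 0.0   # estimates E[sum_i gamma_i(yhat_i(gamma))]
for _ in range(S):
    gamma = [[gumbel(), gumbel()] for _ in range(2)]  # one i.i.d. value per (i, y_i)
    yhat = max(labelings,
               key=lambda y: theta(y) + sum(gamma[i][y[i]] for i in range(2)))
    perturb = sum(gamma[i][yhat[i]] for i in range(2))
    logz_bound += theta(yhat) + perturb
    ent_bound += perturb
logz_bound /= S
ent_bound /= S

# In expectation: logz_bound >= log Z and ent_bound >= H(p).
print(f"log Z = {math.log(Z):.3f}, perturbation bound ~ {logz_bound:.3f}")
print(f"H(p)  = {H:.3f}, perturbation bound ~ {ent_bound:.3f}")
```

Only the argmax is ever needed, so on a real MRF the inner `max` would be replaced by a graph-cut or MPLP solver used as a black box, exactly as the conclusions describe.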