Image Categorization - University of Illinois Urbana-Champaign (Transcript)
Image Categorization
Computer Vision
CS 543 / ECE 549
University of Illinois
Derek Hoiem
03/11/10
Last classes
• Object recognition: localizing an object instance in an image
• Face recognition: matching one face image to another
Today’s class: categorization
• Overview of image categorization
• Representation
  – Image histograms
• Classification
  – Important concepts in machine learning
  – What the classifiers are and when to use them
Image Categorization
[Diagram: Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier]
Image Categorization
[Diagram, training: Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier]
[Diagram, testing: Test Image → Image Features → Trained Classifier → Prediction: "Outdoor"]
Part 1: Image features
[Diagram: Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier]
General Principles of Representation
• Coverage
  – Ensure that all relevant info is captured
• Concision
  – Minimize number of features without sacrificing coverage
• Directness
  – Ideal features are independently useful for prediction
[Figure: image intensity]
Image Representations: Histograms
Global histogram
• Represent distribution of features
  – Color, texture, depth, …
[Figure: "Space Shuttle Cargo Bay" image with its histogram]
Images from Dave Kauchak
Image Representations: Histograms
Histogram: probability or count of data in each bin
• Joint histogram
  – Requires lots of data
  – Loss of resolution to avoid empty bins
• Marginal histogram
  – Requires independent features
  – More data/bin than joint histogram
Images from Dave Kauchak
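A minimal NumPy sketch of the joint vs. marginal contrast (an illustration, not from the slides; the red/green arrays are made-up per-pixel feature values):

```python
import numpy as np

# Made-up per-pixel features: red and green values for 10,000 pixels.
rng = np.random.default_rng(0)
red = rng.integers(0, 256, size=10_000)
green = rng.integers(0, 256, size=10_000)

# Joint histogram: one bin per (red, green) cell. Bins multiply across
# dimensions, so we keep them coarse (8 x 8 = 64 bins) to avoid empty bins.
joint, _, _ = np.histogram2d(red, green, bins=8, range=[[0, 256], [0, 256]])

# Marginal histograms: one 1D histogram per feature. The same pixels now
# spread over far fewer bins (more data per bin), but any dependency
# between red and green is lost.
red_hist, _ = np.histogram(red, bins=32, range=(0, 256))
green_hist, _ = np.histogram(green, bins=32, range=(0, 256))

# Normalize counts to probabilities so images of different sizes compare.
joint /= joint.sum()
red_hist = red_hist / red_hist.sum()
green_hist = green_hist / green_hist.sum()
```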
Image Representations: Histograms
Clustering
• Use the same cluster centers for all images
[Figures: "EASE Truss Assembly" and "Space Shuttle Cargo Bay" images with histograms over shared cluster centers]
Images from Dave Kauchak
Computing histogram distance
Chi-squared histogram matching distance:

$$\chi^2(h_i, h_j) = \frac{1}{2}\sum_{m=1}^{K}\frac{[h_i(m) - h_j(m)]^2}{h_i(m) + h_j(m)}$$

Histogram intersection (assuming normalized histograms):

$$\mathrm{histint}(h_i, h_j) = 1 - \sum_{m=1}^{K}\min\big(h_i(m),\, h_j(m)\big)$$
Cars found by color histogram matching using chi-squared
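The two distances transcribe directly into NumPy (a sketch; the small eps guarding division by zero is an addition):

```python
import numpy as np

def chi2_distance(h_i, h_j, eps=1e-10):
    """Chi-squared histogram matching distance: 1/2 * sum (hi-hj)^2 / (hi+hj)."""
    return 0.5 * np.sum((h_i - h_j) ** 2 / (h_i + h_j + eps))

def histint_distance(h_i, h_j):
    """Histogram intersection distance; assumes normalized histograms."""
    return 1.0 - np.sum(np.minimum(h_i, h_j))

# Two normalized 4-bin histograms:
h1 = np.array([0.25, 0.25, 0.25, 0.25])
h2 = np.array([0.40, 0.10, 0.30, 0.20])
print(chi2_distance(h1, h2))     # small, nonzero
print(histint_distance(h1, h2))  # 1 - (0.25 + 0.10 + 0.25 + 0.20) = 0.20
```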
Histograms: Implementation issues
Few bins: need less data, coarser representation
Many bins: need more data, finer representation
• Quantization
  – Grids: fast, but only applicable with few dimensions
  – Clustering: slower, but can quantize data in higher dimensions
• Matching
  – Histogram intersection or Euclidean may be faster
  – Chi-squared often works better
  – Earth mover's distance is good when nearby bins represent similar values
What kind of things do we compute histograms of?
• Color
  – L*a*b* color space, HSV color space
• Texture (filter banks or HOG over regions)
What kind of things do we compute histograms of?
• Histograms of gradient (SIFT – Lowe IJCV 2004)
• Visual words
Image Categorization: Bag of Words

Training
1. Extract keypoints and descriptors for all training images
2. Cluster descriptors
3. Quantize descriptors using cluster centers to get "visual words"
4. Represent each image by normalized counts of "visual words"
5. Train classifier on labeled examples using histogram values as features

Testing
1. Extract keypoints/descriptors and quantize into visual words
2. Compute visual word histogram
3. Compute label or confidence using classifier
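A sketch of the whole pipeline, assuming OpenCV (with SIFT available, version 4.4+) and scikit-learn; train_paths, train_labels, and test_path are placeholder names:

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

sift = cv2.SIFT_create()

def descriptors(path):
    """Step 1: keypoint descriptors for one image (shape: n_keypoints x 128)."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = sift.detectAndCompute(img, None)
    return desc

# Step 2: cluster all training descriptors into K visual words.
K = 200
all_desc = np.vstack([descriptors(p) for p in train_paths])
kmeans = KMeans(n_clusters=K, n_init=10).fit(all_desc)

# Steps 3-4: quantize each image and build a normalized word histogram.
def bow_histogram(path):
    words = kmeans.predict(descriptors(path))
    hist = np.bincount(words, minlength=K).astype(float)
    return hist / hist.sum()

# Step 5: train any classifier on the histograms.
X_train = np.array([bow_histogram(p) for p in train_paths])
clf = LogisticRegression(max_iter=1000).fit(X_train, train_labels)

# Testing 1-3: quantize, histogram, classify.
print(clf.predict([bow_histogram(test_path)]))
```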
But what about layout?
All of these images have the same color histogram
Spatial pyramid
Compute histogram in each spatial bin
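For example, a one-level spatial grid just concatenates per-cell histograms (a sketch; word_map is an assumed 2D array giving the visual-word index at each pixel):

```python
import numpy as np

def spatial_histograms(word_map, num_words, grid=2):
    """Concatenate visual-word histograms over a grid x grid spatial layout."""
    h, w = word_map.shape
    hists = []
    for i in range(grid):
        for j in range(grid):
            # Histogram of the visual words falling in this spatial cell.
            cell = word_map[i * h // grid:(i + 1) * h // grid,
                            j * w // grid:(j + 1) * w // grid]
            hist = np.bincount(cell.ravel(), minlength=num_words).astype(float)
            hists.append(hist / max(hist.sum(), 1.0))
    return np.concatenate(hists)  # length: grid * grid * num_words
```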
Part 2: Classifiers
[Diagram: Training Images + Training Labels → Image Features → Classifier Training → Trained Classifier]
Learning a classifier
• Given a set of features with corresponding labels, learn a function to predict the labels from the features
[Figure: two classes (x, o) in a 2D feature space (x1, x2)]
Many classifiers to choose from
• SVM
• Neural networks
• Naïve Bayes
• Bayesian network
• Logistic regression
• Randomized Forests
• Boosted Decision Trees
• K‐nearest neighbor
• RBMs
• Etc.
Which is the best one?
No Free Lunch Theorem
Bias‐Variance Trade‐off
MSE = bias² + variance
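Spelled out for an estimator f̂(x) of a fixed target f(x), with the expectation taken over training sets (irreducible noise omitted, as on the slide):

```latex
\mathbb{E}\!\left[(\hat{f}(x) - f(x))^2\right]
  = \underbrace{\left(\mathbb{E}[\hat{f}(x)] - f(x)\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\left(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\right)^2\right]}_{\text{variance}}
```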
Bias and Variance
[Figure: test error vs. complexity, one curve for many training examples and one for few. Moving right along the complexity axis goes from high bias/low variance to low bias/high variance. Error = bias² + variance.]
Choosing the trade‐off
• Need validation set
• Validation set not same as test set
[Figure: training error decreases with complexity while test error is U-shaped; the complexity axis runs from high bias/low variance (simple) to low bias/high variance (complex).]
Effect of Training Size
[Figure: for a fixed classifier, training error rises and testing error falls as the number of training examples grows; the remaining gap between the two curves is the generalization error.]
How to measure complexity?
• VC dimension
With probability 1 − η, the generalization error is bounded by the training error plus a complexity term:

$$E_{\text{test}} \le E_{\text{train}} + \sqrt{\frac{h\left(\ln\frac{2N}{h} + 1\right) - \ln\frac{\eta}{4}}{N}}$$

N: size of training set; h: VC dimension; η: 1 − probability
How to reduce variance?
• Choose a simpler classifier
• Regularize the parameters
• Get more training data
The perfect classification algorithm
• Objective function: solves what you want to solve
• Parameterization: makes assumptions that fit the problem
• Regularization: right level of regularization for amount of training data
• Training algorithm: can find parameters that maximize objective on training set
• Inference algorithm: can solve for the objective function at evaluation (test) time
Generative vs. Discriminative Classifiers
Generative
• Training
  – Maximize joint likelihood of data and labels
  – Assume (or learn) probability distribution and dependency structure
  – Can impose priors
• Testing
  – P(y=1, x) / P(y=0, x) > t?
• Examples
  – Foreground/background GMM
  – Naïve Bayes classifier
  – Bayesian network

Discriminative
• Training
  – Learn to directly predict the labels from the data
  – Assume form of boundary
  – Margin maximization or parameter regularization
• Testing
  – f(x) > t; e.g., wᵀx > t
• Examples
  – Logistic regression
  – SVM
  – Boosted decision trees
Generative Classifier: Naïve Bayes
• Objective
• Parameterization
• Regularization
• Training
• Inference
[Graphical model: label y with conditionally independent features x1, x2, x3]
Using Naïve Bayes
• Simple thing to try for categorical data
• Very fast to train/test
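For instance, with scikit-learn (a sketch; the toy count features stand in for, e.g., visual-word counts):

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Toy count features for 6 "images", two classes.
X = np.array([[5, 0, 1], [4, 1, 0], [6, 0, 2],   # class 0
              [0, 5, 3], [1, 4, 4], [0, 6, 2]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = MultinomialNB().fit(X, y)        # training is one pass over the counts
print(clf.predict([[5, 1, 1]]))        # -> [0]
print(clf.predict_proba([[5, 1, 1]]))  # posterior over the two classes
```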
Classifiers: Logistic Regression
• Objective
• Parameterization
• Regularization
• Training
• Inference
[Figure: two classes (x, o) in a 2D feature space (x1, x2)]
Using Logistic Regression
• Quick, simple classifier (try it first)
• Use L2 or L1 regularization
  – L1 does feature selection and is robust to irrelevant features
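A minimal scikit-learn sketch of both options (synthetic data; liblinear is one solver that supports the L1 penalty):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)

# L2 (the default): shrinks all weights; C is inverse regularization strength.
clf_l2 = LogisticRegression(penalty="l2", C=1.0).fit(X, y)

# L1: drives weights of irrelevant features to exactly zero (feature selection).
clf_l1 = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)
print((clf_l1.coef_ != 0).sum(), "of 20 features kept by L1")
```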
Classifiers: Linear SVM
• Objective
• Parameterization
• Regularization
• Training
• Inference
[Figure: linearly separable classes (x, o) in (x1, x2) with a max-margin boundary]
Classifiers: Kernelized SVM
• Objective
• Parameterization
• Regularization
• Training
• Inference
[Figure: classes (x, o) that are not linearly separable in the original feature space]
Using SVMs
• Good general purpose classifier
  – Generalization depends on margin, so works well with many weak features
  – No feature selection
  – Usually requires some parameter tuning
• Choosing kernel
  – Linear: fast training/testing – start here
  – RBF: related to neural networks, nearest neighbor
  – Chi-squared, histogram intersection: good for histograms (but slower, especially chi-squared)
  – Can learn a kernel function
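A scikit-learn sketch of the "start linear, then try RBF" advice (synthetic data):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, LinearSVC

X, y = make_classification(n_samples=300, n_features=50, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Linear SVM: fast training/testing, the usual first try.
linear = LinearSVC(C=1.0, max_iter=10_000).fit(X_tr, y_tr)
print("linear:", linear.score(X_te, y_te))

# RBF kernel: tune C (slack penalty) and gamma (kernel width) by validation.
rbf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print("rbf:", rbf.score(X_te, y_te))

# For histogram features, a chi-squared kernel can be supplied as a
# precomputed Gram matrix (e.g., sklearn.metrics.pairwise.chi2_kernel
# together with SVC(kernel="precomputed")).
```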
Classifiers: Decision Trees
• Objective
• Parameterization
• Regularization
• Training
• Inference
[Figure: axis-aligned decision-tree partition of the (x1, x2) space separating x's from o's]
Ensemble Methods: Boosting
figure from Friedman et al. 2000
Boosted Decision Trees
[Figure: two boosted decision trees. Nodes ask yes/no questions such as "Gray?", "High in image?", "Many long lines?", "Very high vanishing point?", "Smooth?", "Green?", "Blue?"; leaves label a region as Ground, Vertical, or Sky.]
[Collins et al. 2002]
P(label | good segment, data)
Using Boosted Decision Trees
• Flexible: can deal with both continuous and categorical variables
• How to control the bias/variance trade-off
  – Size of trees
  – Number of trees
• Boosting trees often works best with a small number of well-designed features
• Boosting "stumps" (depth-1 trees) can give a fast classifier
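For example, boosted stumps with scikit-learn (a sketch; in versions before 1.2 the estimator argument is named base_estimator):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Each weak learner is a depth-1 tree (a stump). Tree depth and the
# number of trees are the bias/variance knobs mentioned above.
stump = DecisionTreeClassifier(max_depth=1)
clf = AdaBoostClassifier(estimator=stump, n_estimators=100).fit(X, y)
print(clf.score(X, y))
```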
K‐nearest neighbor
• Objective
• Parameterization
• Regularization
• Training
• Inference
[Figure: two classes (x, o) in (x1, x2) with two query points (+) to classify by their nearest neighbors]
1‐nearest neighbor
[Figure: each query point (+) takes the label of its single nearest neighbor]
3‐nearest neighbor
[Figure: each query point (+) takes the majority label of its 3 nearest neighbors]
5‐nearest neighbor
[Figure: each query point (+) takes the majority label of its 5 nearest neighbors]
Using K‐NN
• Simple, so another good one to try first
• With infinite examples, 1‐NN provably has error that is at most twice Bayes optimal error
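A quick scikit-learn sketch, choosing K by cross-validation rather than on the test set:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# K = 1 has low bias / high variance; larger K smooths the decision.
for k in (1, 3, 5):
    clf = KNeighborsClassifier(n_neighbors=k)
    print(k, cross_val_score(clf, X, y, cv=5).mean())
```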
Clustering (unsupervised)
[Figure: the same points with labels removed are grouped into clusters (+) using only the feature values (x1, x2)]
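A sketch of the unsupervised case with k-means (made-up 2D points; no labels are used):

```python
import numpy as np
from sklearn.cluster import KMeans

# Two made-up blobs of unlabeled 2D points (x1, x2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
               rng.normal(3.0, 0.5, (50, 2))])

kmeans = KMeans(n_clusters=2, n_init=10).fit(X)
print(kmeans.cluster_centers_)  # discovered group centers
print(kmeans.labels_[:5])       # cluster assignment per point
```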
What to remember about classifiers
• No free lunch: machine learning algorithms are tools, not dogmas
• Try simple classifiers first
• Better to have smart features and simple classifiers than simple features and smart classifiers
• Use increasingly powerful classifiers with more training data (bias‐variance tradeoff)
Next class
• Object category detection overview
Some Machine Learning References
• General
  – Tom Mitchell, Machine Learning, McGraw Hill, 1997
  – Christopher Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995
• Adaboost
  – Friedman, Hastie, and Tibshirani, "Additive logistic regression: a statistical view of boosting", Annals of Statistics, 2000
• SVMs
  – http://www.support‐vector.net/icml‐tutorial.pdf