
[06-30213][06-30241][06-25024]
Computer Vision and Imaging &
Robot Vision
Dr Hyung Jin Chang Dr Yixing Gao
h.j.chang@bham.ac.uk y.gao.8@bham.ac.uk
School of Computer Science

FEATURE-BASED ALIGNMENT
(SZELISKI 6)
2

In this session
• Feature-based alignment
– 2D transformations
– Affine fit
– RANSAC
3

Motivation: Recognition
Figures from David Lowe

Slide credit: Kristen Grauman
Motivation: medical image registration
5

Motivation: mosaics
Image from http://graphics.cs.cmu.edu/courses/15-463/2010_fall/

Alignment problem
• We have previously considered how to fit a model to image evidence
– e.g., a line to edge points
• In alignment, we will fit the parameters of some
transformation according to a set of matching feature
pairs (“correspondences”).
[Figure: features x_i in one image mapped by the transformation T to features x_i' in the other image]
Slide credit: Kristen Grauman
7

Parametric (global) warping
Examples of parametric warps:
translation
rotation
aspect
affine
perspective
Source: Alyosha Efros
8

Parametric (global) warping
p = (x, y)  →  T  →  p' = (x', y')
Transformation T is a coordinate-changing machine:
p' = T(p)
What does it mean that T is global?
• It is the same for any point p
• It can be described by just a few numbers (parameters)
Let's represent T as a matrix M:  p' = Mp
$\begin{bmatrix} x' \\ y' \end{bmatrix} = M \begin{bmatrix} x \\ y \end{bmatrix}$
9
Source: Alyosha Efros

Scaling
Scaling a coordinate means multiplying each of its components by a scalar
Uniform scaling means this scalar is the same for all components:
× 2
10
Source: Alyosha Efros

Scaling
Non-uniform scaling: different scalars per component:
X × 2, Y × 0.5
11
Source: Alyosha Efros

Scaling
Scaling operation:
x' = ax
y' = by
Or, in matrix form:
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
scaling matrix S
12
Source: Alyosha Efros

What transformations can be represented with a 2×2 matrix?
2D Scaling?
x' = s_x · x,  y' = s_y · y
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} s_x & 0 \\ 0 & s_y \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
2D Rotate around (0,0)?
x' = cos θ · x - sin θ · y,  y' = sin θ · x + cos θ · y
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
2D Shear?
x' = x + sh_x · y,  y' = sh_y · x + y
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} 1 & sh_x \\ sh_y & 1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
Source: Alyosha Efros
13

What transformations can be represented with a 2×2 matrix?
2D Mirror about Y axis?
x' = -x,  y' = y
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} -1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
2D Mirror over (0,0)?
x' = -x,  y' = -y
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
2D Translation?
x' = x + t_x,  y' = y + t_y
NO! Translation cannot be written as a 2×2 matrix acting on (x, y).
Source: Alyosha Efros

2D Linear Transformations
$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$
Only linear 2D transformations can be represented with a 2×2 matrix.
Linear transformations are combinations of …
• Scale,
• Rotation,
• Shear, and
• Mirror
15
Source: Alyosha Efros
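To make the 2×2 picture concrete, here is a minimal NumPy sketch (the values and variable names are my own, not from the slides) that builds scale, rotation, shear and mirror matrices and applies them to a point; composing transformations is just matrix multiplication.

import numpy as np

theta = np.deg2rad(30)
S = np.array([[2.0, 0.0],
              [0.0, 0.5]])                         # non-uniform scale
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # rotation about (0, 0)
H = np.array([[1.0, 0.3],
              [0.2, 1.0]])                         # shear
My = np.array([[-1.0, 0.0],
               [ 0.0, 1.0]])                       # mirror about the Y axis

p = np.array([1.0, 1.0])                           # a point as a column vector
print(S @ p, R @ p, H @ p, My @ p)                 # each transform is "matrix times point"
print((R @ S) @ p)                                 # composition: scale first, then rotate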

Homogeneous coordinates
Convenient coordinate system to represent many useful transformations
To convert to homogeneous coordinates:
$(x, y) \Rightarrow (x, y, 1)$   (homogeneous image coordinates)
Converting from homogeneous coordinates:
$(x, y, w) \Rightarrow (x/w, y/w)$
Slide credit: Kristen Grauman
16

Homogeneous Coordinates
Q: How can we represent 2D translation as a 3×3 matrix using homogeneous coordinates?
x' = x + t_x,  y' = y + t_y
A: Using the rightmost column:
Translation = $\begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix}$
17
Source: Alyosha Efros

Translation
Homogeneous Coordinates
$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} x + t_x \\ y + t_y \\ 1 \end{bmatrix}$
Example: t_x = 2, t_y = 1
18
Source: Alyosha Efros
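As a quick numerical check (an illustrative sketch with made-up numbers), the same translation can be applied by converting a point to homogeneous coordinates, multiplying by the 3×3 matrix, and converting back:

import numpy as np

tx, ty = 2.0, 1.0
T = np.array([[1.0, 0.0, tx],
              [0.0, 1.0, ty],
              [0.0, 0.0, 1.0]])         # translation as a 3x3 matrix

p = np.array([3.0, 4.0])                # Cartesian point (x, y)
p_h = np.append(p, 1.0)                 # to homogeneous coordinates: (x, y, 1)
q_h = T @ p_h                           # apply the transformation
q = q_h[:2] / q_h[2]                    # back to Cartesian: divide by w
print(q)                                # [5. 5.], i.e. (x + tx, y + ty)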

Basic 2D Transformations
Basic 2D transformations as 3×3 matrices:
Translate:
$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$
Scale:
$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} s_x & 0 & 0 \\ 0 & s_y & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$
Rotate:
$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$
Shear:
$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & sh_x & 0 \\ sh_y & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$
19
Source: Alyosha Efros
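Because every basic transformation above is a 3×3 matrix, a whole chain of them collapses into a single matrix product. A small sketch (the helper names are my own):

import numpy as np

def translate(tx, ty):
    return np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1]], dtype=float)

def rotate(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

def scale(sx, sy):
    return np.array([[sx, 0, 0], [0, sy, 0], [0, 0, 1]], dtype=float)

# Compose right-to-left: scale first, then rotate, then translate.
M = translate(2, 1) @ rotate(np.deg2rad(45)) @ scale(2, 2)
p_h = np.array([1.0, 0.0, 1.0])         # the point (1, 0) in homogeneous coordinates
print(M @ p_h)                          # the transformed point (x', y', 1)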

2D Affine Transformations
$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = \begin{bmatrix} a & b & c \\ d & e & f \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ w \end{bmatrix}$
Affine transformations are combinations of …
• Linear transformations, and
• Translations
Parallel lines remain parallel
Slide credit: Kristen Grauman
20
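A small sketch (using an arbitrary, made-up affine matrix) that checks this numerically: two parallel segments stay parallel after an affine warp, because their direction vectors are both multiplied by the same linear part.

import numpy as np

A = np.array([[1.2, 0.4,  3.0],
              [0.1, 0.9, -2.0],
              [0.0, 0.0,  1.0]])            # affine matrix [a b c; d e f; 0 0 1]

def warp(points):
    ph = np.hstack([points, np.ones((len(points), 1))])   # to homogeneous coordinates
    return (ph @ A.T)[:, :2]                               # apply A, drop w (= 1)

line1 = np.array([[0.0, 0.0], [1.0, 2.0]])  # two segments with the same direction
line2 = np.array([[3.0, 1.0], [4.0, 3.0]])
d1 = np.diff(warp(line1), axis=0)[0]        # direction of warped segment 1
d2 = np.diff(warp(line2), axis=0)[0]        # direction of warped segment 2
print(d1[0] * d2[1] - d1[1] * d2[0])        # cross-product z-component, ~0: still parallel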

In this session
• Feature-based alignment
– 2D transformations
– Affine fit
– RANSAC
21

Alignment problem
• We have previously considered how to fit a model to image evidence
– e.g., a line to edge points
• In alignment, we will fit the parameters of some
transformation according to a set of matching feature
pairs (“correspondences”).
[Figure: features x_i in one image mapped by the transformation T to features x_i' in the other image]
Kristen Grauman

Image alignment
• Two broad approaches:
– Direct (pixel-based) alignment
• Search for alignment where most pixels agree
– Feature-based alignment
• Search for alignment where extracted features agree
• Can be verified using pixel-based alignment
23
Slide credit: Kristen Grauman

Fitting an affine transformation
• Assuming we know the correspondences, how do we get the transformation?
Correspondences: $(x_i, y_i) \leftrightarrow (x'_i, y'_i)$
$\begin{bmatrix} x'_i \\ y'_i \end{bmatrix} = \begin{bmatrix} m_1 & m_2 \\ m_3 & m_4 \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} t_1 \\ t_2 \end{bmatrix}$
Slide credit: Kristen Grauman
24

An aside: Least Squares Example
Say we have a set of data points (X1,X1’), (X2,X2’), (X3,X3’), etc. (e.g. person’s height vs. weight)
We want a nice compact formula (a line) to predict X’s from Xs: Xa + b = X’
We want to find a and b
How many (X, X') pairs do we need?
X_1 a + b = X'_1
X_2 a + b = X'_2
In matrix form:
$\begin{bmatrix} X_1 & 1 \\ X_2 & 1 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} X'_1 \\ X'_2 \end{bmatrix}$,   i.e.  $Ax = B$
What if the data is noisy? Minimise $\| Ax - B \|^2$ for the overconstrained system
$\begin{bmatrix} X_1 & 1 \\ X_2 & 1 \\ X_3 & 1 \\ \vdots & \vdots \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} X'_1 \\ X'_2 \\ X'_3 \\ \vdots \end{bmatrix}$
Source: Alyosha Efros
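The overconstrained system above is exactly what a least-squares solver handles. A minimal sketch with synthetic (made-up) data; np.linalg.lstsq minimises the squared residual ||Ax - B||^2:

import numpy as np

rng = np.random.default_rng(0)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Xp = 2.0 * X + 1.0 + rng.normal(scale=0.1, size=X.shape)   # noisy X' = 2X + 1

A = np.column_stack([X, np.ones_like(X)])   # each row is [X_i, 1]
B = Xp                                      # each entry is X'_i
(a, b), residuals, rank, sv = np.linalg.lstsq(A, B, rcond=None)
print(a, b)                                 # close to 2 and 1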

Fitting an affine transformation
• Assuming we know the correspondences, how do we get the transformation?
Correspondences: $(x_i, y_i) \leftrightarrow (x'_i, y'_i)$
$\begin{bmatrix} x'_i \\ y'_i \end{bmatrix} = \begin{bmatrix} m_1 & m_2 \\ m_3 & m_4 \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} t_1 \\ t_2 \end{bmatrix}$
Stacking two rows per correspondence gives one linear system in the six unknowns:
$\begin{bmatrix} \vdots & & & & & \\ x_i & y_i & 0 & 0 & 1 & 0 \\ 0 & 0 & x_i & y_i & 0 & 1 \\ & & \vdots & & & \end{bmatrix} \begin{bmatrix} m_1 \\ m_2 \\ m_3 \\ m_4 \\ t_1 \\ t_2 \end{bmatrix} = \begin{bmatrix} \vdots \\ x'_i \\ y'_i \\ \vdots \end{bmatrix}$
Slide credit: Kristen Grauman

Fitting an affine transformation
$\begin{bmatrix} \vdots & & & & & \\ x_i & y_i & 0 & 0 & 1 & 0 \\ 0 & 0 & x_i & y_i & 0 & 1 \\ & & \vdots & & & \end{bmatrix} \begin{bmatrix} m_1 \\ m_2 \\ m_3 \\ m_4 \\ t_1 \\ t_2 \end{bmatrix} = \begin{bmatrix} \vdots \\ x'_i \\ y'_i \\ \vdots \end{bmatrix}$
• How many matches (correspondence pairs) do we
need to solve for the transformation parameters?
27
Kristen Grauman

Affine: # correspondences?
[Figure: a shape in the (x, y) frame warped by T(x, y) into the (x', y') frame]
How many correspondences needed for affine?
Alyosha Efros

Fitting an affine transformation
$\begin{bmatrix} \vdots & & & & & \\ x_i & y_i & 0 & 0 & 1 & 0 \\ 0 & 0 & x_i & y_i & 0 & 1 \\ & & \vdots & & & \end{bmatrix} \begin{bmatrix} m_1 \\ m_2 \\ m_3 \\ m_4 \\ t_1 \\ t_2 \end{bmatrix} = \begin{bmatrix} \vdots \\ x'_i \\ y'_i \\ \vdots \end{bmatrix}$
• How many matches (correspondence pairs) do we
need to solve for the transformation parameters?
• Once we have solved for the parameters, how do we compute the coordinates of the corresponding point for (x_new, y_new)?
• Where do the matches come from?
29
Kristen Grauman
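Each correspondence contributes two equations, so at least three non-collinear pairs are needed for the six unknowns. Below is a sketch (the function names and the correspondences are my own, made up for illustration) of building the stacked system, solving it with least squares, and mapping a new point:

import numpy as np

def fit_affine(src, dst):
    """Estimate m1..m4, t1, t2 from correspondences src[i] -> dst[i] (>= 3 pairs)."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 0, 0, 1, 0]); b.append(xp)   # row for x'_i
        A.append([0, 0, x, y, 0, 1]); b.append(yp)   # row for y'_i
    p, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    m1, m2, m3, m4, t1, t2 = p
    return np.array([[m1, m2], [m3, m4]]), np.array([t1, t2])

def apply_affine(M, t, pt):
    return M @ np.asarray(pt, float) + t

# Usage with made-up correspondences (scale y by 2, then translate by (2, 1)):
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(2, 1), (3, 1), (2, 3), (3, 3)]
M, t = fit_affine(src, dst)
print(apply_affine(M, t, (0.5, 0.5)))    # coordinates of the corresponding point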

What are the correspondences?
• Compare content in local patches, find best matches. E.g., the simplest approach: scan with a template and compute SSD or correlation between the lists of pixel intensities in the patches.
• Later in the course: how to select regions using more robust descriptors.
Kristen Grauman

Fitting an affine transformation
Figures from David Lowe, ICCV 1999
31

Fitting an affine transformation
32

Fitting an affine transformation
33

Fitting an affine transformation
Figures from David Lowe, ICCV 1999

In this session
• Feature-based alignment
– 2D transformations
– Affine fit
– RANSAC
35

Need to deal even better with outliers
• Large disagreements in only a few points (outliers) cause failure of least-squares-based methods.
• Detection, localization and recognition in CV have to operate on significantly noisy data.
• In some cases more than half of the data is expected to be outliers.
• Standard methods for robust estimation can rarely deal with such a large proportion of outliers.
36

Outliers
• Outliers can hurt the quality of our parameter estimates, e.g.,
– an erroneous pair of matching points from two images
– an edge point that is noise, or doesn’t belong to the line we are fitting.
Kristen Grauman

Outliers affect least squares fit
Slide credit: Kristen Grauman
38

Outliers affect least squares fit
Slide credit: Kristen Grauman
39

RANSAC
• RANdom SAmple Consensus
• Very popular due to its generality and simplicity.
• Can deal with large proportions of outliers.
• Published in 1981 (Fischler and Bolles)
• One of the most cited papers in Computer Vision (cited by 25385)
• Many improvements proposed since!
M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image
Analysis and Automated Cartography. Comm. of the ACM, Vol 24, pp. 381-395, 1981. Slide credit: Kristen Grauman
40

RANSAC
• RANdom SAmple Consensus
• Approach: we want to avoid the impact of outliers, so let's look for “inliers”, and use those only.
• Intuition: if an outlier is chosen to compute the current fit, then the resulting line won't have much support from the rest of the points.
Slide credit: Kristen Grauman
41

RANSAC: Intuition by line fitting
• A good estimate of our model should have strong support in the data: “recognize a good model when you see it”
[Figure: two candidate lines, each with an allowed-error band; 10 points support one line, only 4 points support the other]
• So how can we find a model with strong support?
• By randomly sampling potential models.

RANSAC: Intuition by line fitting
• Task: Robustly estimate the most likely line
Slide credit: Jinxiang Chai

RANSAC: Intuition by line fitting
• Task: Robustly estimate the most likely line
Randomly choose a pair of points
(Note: the smallest number of points to fit a line is two)
Slide credit: Jinxiang Chai

RANSAC: Intuition by line fitting
• Task: Robustly estimate the most likely line
Fit the line to the selected points.
Slide credit: Jinxiang Chai

RANSAC: Intuition by line fitting
• Task: Robustly estimate the most likely line
The inliers are all points whose error is lower than some prescribed value:
$\varepsilon_i = |f(x_i; \mathbf{p}) - y_i|$
Slide credit: Jinxiang Chai

RANSAC: Intuition by line fitting
• Task: Robustly estimate the most likely line
Repeat for N iterations, or until the support becomes strong enough (actually this is an oversimplification).
Slide credit: Jinxiang Chai

RANSAC: General form • RANSAC loop:
1. Randomly select a seed group of points on which to base transformation estimate
2. Compute transformation from seed group
3. Find inliers to this transformation
4. If the number of inliers is sufficiently large, re-compute estimate of transformation on all of the inliers
• Keep the transformation with the largest number of inliers
Slide credit: Kristen Grauman
48
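A sketch of the loop above in generic form (the function names and parameters are my own, not a specific library API): the caller supplies a model-fitting function and an error function, and RANSAC keeps the hypothesis with the largest inlier support.

import numpy as np

def ransac(data, fit_fn, error_fn, min_samples, threshold, min_inliers,
           n_iters=1000, seed=0):
    """Generic hypothesize-and-verify loop over the rows of `data` (a NumPy array)."""
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, np.zeros(len(data), dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(len(data), size=min_samples, replace=False)  # 1. random seed group
        model = fit_fn(data[idx])                                     # 2. fit to the seed group
        inliers = error_fn(model, data) < threshold                   # 3. find inliers
        if inliers.sum() >= min_inliers and inliers.sum() > best_inliers.sum():
            best_model = fit_fn(data[inliers])                        # 4. refit on all inliers
            best_inliers = inliers                                    # keep the largest support
    return best_model, best_inliers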

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
49

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
Least-squares fit
50

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
51

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
52

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
3. Compute error function
53

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
3. Compute error function
4. Select points consistent with model
54

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
3. Compute error function
4. Select points consistent with model
5. Repeat
hypothesize-and-verify loop
55

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
3. Compute error function
4. Select points consistent with model
5. Repeat
hypothesize-and-verify loop
56

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
3. Compute error function
4. Select points consistent with model
5. Repeat
hypothesize-and-verify loop
57

RANSAC for line fitting example
Source: R. Raguram
Lana Lazebnik
1. Randomly select minimal subset of points
2. Hypothesize a model
3. Compute error function
4. Select points consistent with model
5. Repeat
hypothesize-and-verify loop
58

RANSAC for line fitting
Repeat N times:
• Draw s points uniformly at random
• Fit line to these s points
• Find inliers to this line among the remaining points (i.e., points whose distance from the line is less than t)
• If there are d or more inliers, accept the line and refit using all inliers
59
Lana Lazebnik
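A sketch of this procedure for 2D points (the parameter names follow the slide: s sample size, t inlier distance, d acceptance count; the SVD-based line fit is my own choice of total least squares, not prescribed by the slide):

import numpy as np

def fit_line(pts):
    """Total-least-squares line a*x + b*y + c = 0 with (a, b) a unit normal."""
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]                              # direction of smallest variance = line normal
    return a, b, -(vt[-1] @ centroid)

def ransac_line(points, s=2, t=0.1, d=20, n_iters=500, seed=0):
    rng = np.random.default_rng(seed)
    best_line, best_count = None, 0
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), size=s, replace=False)]  # draw s points
        a, b, c = fit_line(sample)                                       # fit a line to them
        dist = np.abs(points @ np.array([a, b]) + c)                     # distances to the line
        inliers = points[dist < t]                                       # inliers within t
        if len(inliers) >= d and len(inliers) > best_count:              # accept and refit
            best_line, best_count = fit_line(inliers), len(inliers)
    return best_line, best_count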

RANSAC: line fitting
Another example

A general setting
1. Define the set of “potentially” corresponding points:
$\{\mathbf{x}_i\}_{i=1:N}$, $\{\mathbf{x}'_i\}_{i=1:N}$
2. Define the transformation model: $f(\mathbf{x}; \mathbf{p}): \mathbf{x} \rightarrow \mathbf{x}'$

A simple RANSAC loop
$\{\mathbf{x}_i\}_{i=1:N}$, $\{\mathbf{x}'_i\}_{i=1:N}$,   $f(\mathbf{x}; \mathbf{p}): \mathbf{x} \rightarrow \mathbf{x}'$
1. Randomly select the smallest group of correspondences, from which we can estimate the parameters of our model.
2. Fit the parametric model to the selected correspondences.
3. Count how many of all correspondences are in agreement
with the fitted model – number of inliers.
• Remember the model parameters that maximize the number of inliers.
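A sketch of this loop for the affine case (it reuses the hypothetical fit_affine helper from the earlier sketch; the threshold and iteration count are arbitrary choices): three correspondences are the smallest group from which the six parameters can be estimated.

import numpy as np

def ransac_affine(src, dst, threshold=3.0, n_iters=1000, seed=0):
    """Keep the affine hypothesis (M, t) with the most agreeing correspondences."""
    rng = np.random.default_rng(seed)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best_params, best_inliers = None, np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(len(src), size=3, replace=False)   # smallest group: 3 correspondences
        M, t = fit_affine(src[idx], dst[idx])               # fit the parametric model
        residual = np.linalg.norm(src @ M.T + t - dst, axis=1)
        inliers = residual < threshold                      # correspondences in agreement
        if inliers.sum() > best_inliers.sum():              # remember the best support
            best_params, best_inliers = (M, t), inliers
    return best_params, best_inliers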

After RANSAC: Refit by LS
• RANSAC splits the data into inliers and outliers, and calculates the model parameters using a minimal number of correspondences.
• Improve the model parameters by applying least squares to the inliers.
Slide credit: David Lowe
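For the line-fitting case, for instance, the refit step is just one more least-squares solve over the points RANSAC labelled as inliers (the data below is made up for illustration):

import numpy as np

# Inlier points returned by RANSAC for a line y = a*x + b (made-up values).
inlier_x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
inlier_y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

A = np.column_stack([inlier_x, np.ones_like(inlier_x)])
(a, b), *_ = np.linalg.lstsq(A, inlier_y, rcond=None)    # least squares on inliers only
print(a, b)                                              # refined line parameters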

Beyond the simple RANSAC
• A great deal of research has been invested by many researchers into improving RANSAC
• Particularly into finding the right solution faster
• Or making it more resilient to outliers
• Please see online resources:
• MLESAC (uses maximum likelihood in hypothesis verification)
• PROSAC (better chooses the order of correspondences)
• Or an entire presentation dedicated to recent developments on RANSAC:
J. Matas RANSAC in 2011 – 30 years after, CVPR, 2011

Fitting: Challenges
• If we know the inliers, how do we estimate the parameters?
• Least squares
• What if our data includes outliers?
• Robust least squares, RANSAC
• What if we have multiple instances of our model (e.g., multiple lines)?
• Apply voting: sequential RANSAC, Hough transform
• What if we have multiple models (e.g., unknown degree of a polynomial)?
• Apply model selection (e.g., MDL, BIC, AIC)
• Complicated non-parametric models
• Generalized Hough Transform (GHT)
• Iterative Closest Point (ICP) == iterative local least squares

RANSAC pros and cons
• Pros
• Simple and general
• Applicable to many different problems
• Often works well in practice
• Cons
• Lots of parameters to tune
• Doesn’t work well for low inlier ratios (too many iterations, or can fail completely)
• Can’t always get a good initialization of the model based on the minimum number of samples
69
Lana Lazebnik

Coming up: alignment and image stitching
70

Thank you
Hyung Jin Chang
28/02/2021