Perception for Autonomous Systems 31392:
Visual Odometry
Lecturer: —PhD
30 Mar. 2020, DTU Electrical Engineering
• Orientation (Attitude) Representations
• Relative Pose Estimation
  – 3D registration
  – Least Squares
  – SVD
• Visual Odometry – 3D-3D
• Local Bundle Adjustment
• Visual Inertial Odometry – VIO
  – Loosely Coupled EKF
  – Tightly Coupled EKF
What is Visual Odometry and what it is not
• Visual Odometry (VO) is the use of cameras to estimate the pose (position and orientation) of a mobile system by observing the apparent motion of the "static" world.
• VO assumes a static world in which the only moving object is the mobile system itself.
• VO provides neither a map of the environment, nor does it use previous states of the world to improve its accuracy (that is SLAM).
Video by Prof. Scaramuzza, University of Zurich
• It has been proposed as an alternative to wheel odometry
• Its accuracy is, assuming reasonable trajectories, roughly 1% of the distance travelled
• VO estimates are usually combined with other sensors
  – GPS, IMU, laser, wheel odometry
  – VIO stands for Visual-Inertial Odometry; VINS for Visual-Inertial Navigation System
• VO has had some extraordinary uses
VO is neither SFM nor VSLAM
• Structure from Motion (SfM) also solves for the 3D feature points:
  – Given calibrated point projections $(x_{f,p},\, y_{f,p})$ of points $p = 1 \dots N$ in cameras (or frames) $f = 1 \dots F$,
  – find the rigid transformations $R_f, t_f$ and the 3D point positions $X_p = (X_p, Y_p, Z_p)$ which satisfy the projection equations
• Visual SLAM uses state estimation to exploit additional constraints and re-observations of the same areas (loop closures) to optimize the localization
The 3 main variants of VO: 3D-to-3D, 3D-to-2D, and 2D-to-2D (detailed in the following slides)
Orientation
• Rotation Matrix
  – Positives (the king of orientation representations):
    • Unique, no gimbal lock
  – Negatives:
    • No simple perturbation or interpolation, unintuitive
• Euler angles
  – Positives:
    • Minimal representation, intuitive
  – Negatives:
    • Gimbal lock, non-commutative
• Axis-Angle (exponential coordinates)
  – Positives:
    • No gimbal lock, minimal representation, nice for perturbations, linear mapping to the rotation matrix
  – Negatives:
    • Non-linear "scaling" with respect to the magnitude
• Quaternions
  – Positives:
    • All the axis-angle ones, smooth trajectories
  – Negatives:
    • No direct geometric interpretation
Shuster, M.D. (1993). “A Survey of Attitude Representations”. Journal of the Astronautical Sciences 41 (4): 439–517. Bibcode: 1993JAnSc..41..439S
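The representations above can be related in a few lines of code. Below is a minimal NumPy sketch (not from the slides; the function names are mine) converting an axis-angle parameterisation into a rotation matrix via Rodrigues' formula and into a unit quaternion:

```python
import numpy as np

def axis_angle_to_matrix(axis, angle):
    """Rodrigues' formula: rotation matrix from a unit axis and an angle (rad)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])          # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

def axis_angle_to_quaternion(axis, angle):
    """Unit quaternion (w, x, y, z) from the same axis-angle parameters."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

# Example: 90-degree rotation about the z-axis
R = axis_angle_to_matrix([0, 0, 1], np.pi / 2)
q = axis_angle_to_quaternion([0, 0, 1], np.pi / 2)
print(np.round(R, 3), q)
```

In practice, libraries such as OpenCV (cv2.Rodrigues) or scipy.spatial.transform.Rotation provide these conversions.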
Motion tracking
• Two main approaches:
  – Feature based
  – Optical flow
• Feature based:
  • Calculate features on both images
  • Match among the features
• Optical flow:
  • Calculate features on the first image
  • Do block matching around our initial point (for small motion)
Motion tracking – Optical Flow
• Estimate the apparent motion
• Given a pixel at location $I(x, y, t)$, find the "nearby pixels with the same color":
  – same intensity (in a local window)
  – limited displacement
  – Brightness constancy: $I(x, y, t) = I(x + u, y + v, t + 1)$
  – First-order Taylor expansion: $I(x + u, y + v) \approx I(x, y) + \frac{\partial I}{\partial x} u + \frac{\partial I}{\partial y} v$, which gives $I_t + \nabla I \cdot (u, v) = 0$, i.e. $I_x u + I_y v + I_t = 0$
• This is the (global) optical flow problem; it is hard to solve because it is under-determined: one equation but two unknowns $(u, v)$ per pixel
Motion tracking – Optical Flow
• The previous constraint, $I_x u + I_y v + I_t = 0$, defines a line in $(u, v)$ space
• We can try to impose more constraints so that the line becomes a point
• By assuming that the "velocity" $(u, v)$ is constant over a small image neighborhood, we want to minimize the sum of squared constraint errors over that window:
  $E(u, v) = \sum_{(x, y) \in W} \big( I_x(x, y)\, u + I_y(x, y)\, v + I_t(x, y) \big)^2$
Motion tracking – Optical Flow
• If we use a 5×5 window, that gives us 25 equations per pixel
• This does not work well:
  – at edges (the aperture problem)
  – in large uniform areas (too little texture)
• How to solve it: the iterative approach (a least-squares sketch for a single window follows below)
  – Estimate the motion
  – Warp the image according to the estimate
  – Repeat until no change
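A minimal NumPy sketch (my own, assuming the spatial gradients Ix, Iy and the temporal difference It of one 5×5 window have already been computed) of the per-window least-squares solve described above:

```python
import numpy as np

def lucas_kanade_window(Ix, Iy, It):
    """Solve for the flow (u, v) of one window by least squares.

    Ix, Iy, It: arrays of the same shape (e.g. 5x5) holding the spatial
    gradients and the temporal difference inside the window.
    """
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # 25 x 2 system matrix
    b = -It.ravel()                                  # 25 x 1 right-hand side
    # Solve (A^T A)[u v]^T = A^T b; lstsq also handles rank-deficient windows
    flow, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # (u, v)

# Tiny synthetic example: a pure x-gradient image
Ix = np.ones((5, 5)); Iy = np.zeros((5, 5)); It = -np.ones((5, 5))
print(lucas_kanade_window(Ix, Iy, It))  # approximately [1. 0.]
```

In practice the pyramidal, iterative version is available as cv2.calcOpticalFlowPyrLK in OpenCV.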
Relative Pose estimation – 2 corresponding 3D Point Clouds
• Depends on the available data: 3D or 2D
  (e.g., do we work with a monocular camera or a stereo one?)
• 3D registration – the case where $f_k$ and $f_{k-1}$ are specified as 3D points
  – PCA
  – SVD, RANSAC
  – ICP, or combinations of the above
• What if we have correspondences?
  – Rigid transformation using RANSAC (3 points); an SVD-based sketch follows below
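A minimal sketch (my own, assuming known, outlier-free correspondences) of the least-squares/SVD solution for the rigid transformation between two corresponding 3D point sets; in practice it would be wrapped inside RANSAC or used as the inner step of ICP:

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Least-squares R, t such that R @ P_i + t ≈ Q_i (Arun/Kabsch method via SVD).

    P, Q: (N, 3) arrays of corresponding 3D points in frames k-1 and k.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)          # centroids
    H = (P - cP).T @ (Q - cQ)                        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Usage: rotate/translate a random cloud and recover the motion
rng = np.random.default_rng(0)
P = rng.normal(size=(20, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
t_true = np.array([1.0, 2.0, 3.0])
Q = P @ R_true.T + t_true
R_est, t_est = rigid_transform_3d(P, Q)   # R_est ≈ R_true, t_est ≈ t_true
```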
Relative Pose estimation – 3D “Point clouds” and 2D Image Points
• The problem where $f_{k-1}$ is specified as 3D points and $f_k$ as 2D image coordinates
  – This problem is known as Perspective-n-Point (PnP)
• A popular implementation is P3P (perspective from 3 points):
  – Let P be the centre of perspective
  – Let A, B, C be the "control points", i.e. the 3D correspondences
  – Applying the law of cosines (e.g. $x^2 + z^2 - 2xz\cos\beta = b'$) we get 2 quadratic equations in 2 unknowns, resulting in up to 4 possible solutions for R, t
  – Use it inside RANSAC to find the correct one, or employ a 4th point / check orientation consistency
• Many other implementations exist:
  – One of the most prominent is EPnP ($n \geq 4$)
    • Reformulates the problem with virtual "control points"
• A usage sketch with OpenCV follows below
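A hedged usage sketch of PnP with OpenCV: solvePnPRansac combines a PnP solver (EPnP here; P3P can be selected via the flag) with RANSAC. The wrapper function and its argument shapes are my own assumptions:

```python
import numpy as np
import cv2

def pose_from_3d_2d(object_points, image_points, K, dist=None):
    """PnP + RANSAC: pose of camera k from 3D points of frame k-1 and their 2D projections.

    object_points: (N, 3) float32 3D points, image_points: (N, 2) float32 pixels,
    K: 3x3 intrinsic matrix. Returns R (3x3), t (3x1) and the inlier indices.
    """
    dist = np.zeros(5) if dist is None else dist
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_points, image_points, K, dist,
        flags=cv2.SOLVEPNP_EPNP,          # cv2.SOLVEPNP_P3P would use the minimal solver
        reprojectionError=3.0)
    if not ok:
        raise RuntimeError("PnP failed (too few inliers)")
    R, _ = cv2.Rodrigues(rvec)            # axis-angle (Rodrigues) vector -> rotation matrix
    return R, tvec, inliers
```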
Relative Pose estimation – 2D Image Points
• The problem where $f_{k-1}$ and $f_k$ are both specified in 2D image coordinates
• The minimal-case solution involves 5 point correspondences (the 5-point algorithm for the essential matrix)
• The solution is found by determining the transformation that minimizes the reprojection error of the corresponding points, $\sum_i \lVert p_k^i - g(X^i, C_k) \rVert^2$, where $p_k^i$ are the points in image $k$ and $g(X^i, C_k)$ is the reprojection of the corresponding frame $k-1$ point onto camera $k$
• Wait, but why? (Figure: corresponding epipolar lines)
• A usage sketch with OpenCV follows below
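A hedged sketch (the wrapper function is mine) of the 2D-2D case with OpenCV: the essential matrix is estimated with the 5-point algorithm inside RANSAC and then decomposed into R and t:

```python
import numpy as np
import cv2

def relative_pose_2d2d(pts_prev, pts_curr, K):
    """Estimate R, t (t only up to scale) between two frames from 2D-2D matches.

    pts_prev, pts_curr: (N, 2) float32 arrays of matched pixel coordinates.
    K: 3x3 camera intrinsic matrix.
    """
    # 5-point algorithm inside RANSAC for the essential matrix
    E, mask = cv2.findEssentialMat(pts_prev, pts_curr, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    # Decompose E; the cheirality check selects the valid (R, t) out of 4 candidates.
    # t is returned with unit norm: the scale is not observable from 2D-2D alone.
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K, mask=mask)
    return R, t
```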
Relative Pose estimation – 2D Image Points
• The scale problem:
  – The essential matrix is only determined up to scale, which means the t of [R|t] is also only known up to scale
  – How can we recover it?
• We have to figure out a reasonably accurate relative movement between frames $f_{k-1}$ and $f_k$. Any ideas? One common option is sketched below.
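One standard option, sketched below (my own code, not necessarily the answer intended on the slide): propagate the scale from triangulated points, using the ratio of distances between the same pair of 3D points reconstructed from the previous and the current image pair:

```python
import numpy as np
from itertools import combinations

def relative_scale(X_prev, X_curr, max_pairs=200):
    """Relative scale between two triangulated reconstructions of the same points.

    X_prev: (N, 3) points triangulated from the previous image pair (k-2, k-1).
    X_curr: (N, 3) the same points triangulated from the current pair (k-1, k).
    The median of pairwise-distance ratios is robust to a few bad points.
    """
    ratios = []
    for i, j in list(combinations(range(len(X_prev)), 2))[:max_pairs]:
        d_prev = np.linalg.norm(X_prev[i] - X_prev[j])
        d_curr = np.linalg.norm(X_curr[i] - X_curr[j])
        if d_curr > 1e-9:
            ratios.append(d_prev / d_curr)
    return np.median(ratios)

# The current (unit-norm) translation is then rescaled: t_k = relative_scale(...) * t_k
```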
Visual Odometry 3D-to-3D
Visual Odometry 3D-to-2D
Visual Odometry 2D-to-2D
Some Notes
• 2D-2D and 3D-2D are better than 3D-3D. Why?
• Stereo VO, even with 2D-2D, is better. Why?
• Any ideas on improving VO?
What happens over time?
• VO accumulates the transformations $T_k$ (from frame $f_{k-1}$ to frame $f_k$) over time, providing the full trajectory
• What is the problem with that?
• How can we solve it?
• We can optimize over multiple frames in a procedure called bundle adjustment
(Figure: camera poses $C_0, C_1, \dots$ along the accumulated trajectory)
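A minimal sketch (my own) of chaining the relative transformations $T_k$ into global poses $C_k$ with 4×4 homogeneous matrices; the problem alluded to above is that the small error in every $T_k$ is compounded by this product, so the trajectory drifts:

```python
import numpy as np

def accumulate_poses(relative_transforms):
    """Chain relative transforms T_k (4x4, frame k-1 -> k) into global poses C_k.

    Returns the camera poses expressed in the first frame,
    with C_0 = identity and C_k = C_{k-1} @ T_k.
    """
    C = np.eye(4)
    trajectory = [C.copy()]
    for T in relative_transforms:
        C = C @ T            # small errors in each T_k accumulate, i.e. the pose drifts
        trajectory.append(C.copy())
    return trajectory
```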
Windowed Bundle Adjustment (BA)
• Similar to pose optimization, but it also optimizes the 3D points
• In order not to get stuck in local minima, the initialization should be close to the minimum
• Levenberg-Marquardt can be used
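A heavily simplified sketch (my own, assuming a pinhole camera with known intrinsics K and Rodrigues-vector pose parameters) of windowed BA posed as a nonlinear least-squares problem over the window's camera poses and 3D points, solved with Levenberg-Marquardt via scipy:

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def ba_residuals(params, n_cams, n_pts, cam_idx, pt_idx, observed_uv, K):
    """Reprojection residuals for every observation (camera i sees point j at uv)."""
    cam_params = params[:n_cams * 6].reshape(n_cams, 6)     # rvec (3) + tvec (3) per camera
    points3d = params[n_cams * 6:].reshape(n_pts, 3)
    residuals = []
    for i, j, uv in zip(cam_idx, pt_idx, observed_uv):
        rvec, tvec = cam_params[i, :3], cam_params[i, 3:]
        proj, _ = cv2.projectPoints(points3d[j].reshape(1, 3), rvec, tvec, K, None)
        residuals.append(proj.ravel() - uv)
    return np.concatenate(residuals)

def windowed_ba(init_cams, init_points, cam_idx, pt_idx, observed_uv, K):
    """Refine the window's poses and points with Levenberg-Marquardt (needs a good init)."""
    x0 = np.hstack([init_cams.ravel(), init_points.ravel()])
    result = least_squares(
        ba_residuals, x0, method="lm",
        args=(len(init_cams), len(init_points), cam_idx, pt_idx, observed_uv, K))
    n = len(init_cams) * 6
    return result.x[:n].reshape(-1, 6), result.x[n:].reshape(-1, 3)
```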
Improving the Accuracy of VO
• Other sensors can be used, such as:
  – an IMU (the combination is then called visual-inertial odometry)
• An IMU combined with a single camera allows the estimation of the absolute scale. Why?
Visual Inertial Odometry VIO
“Visual-Inertial odometry (VIO) is the process of estimating the state (pose and velocity) of an agent (e.g., an aerial robot) by using only the input of one or more cameras plus one or more Inertial Measurement Units (IMUs) attached to it”
• Cameras are slow but information-rich
• IMUs are fast but have high noise
• Two main paradigms of filtering for state estimation, depending on how the visual information is integrated:
  – Loosely coupled
  – Tightly coupled
Scaramuzza, D. and Zhang, Z., 2019. Visual-Inertial Odometry of Aerial Robots. arXiv preprint arXiv:1906.03289.
Visual Inertial Odometry VIO
• Kalman state: $x_i = \{T_i, v_i, b_i, B_i\}$, where $T_i$ is the 6-DoF pose of the IMU, $v_i$ is the velocity of the IMU, and $b_i$ and $B_i$ are the biases of the accelerometer and gyroscope, respectively
• A single moving camera allows us to measure the geometry of the 3D scene and the camera motion only up to an unknown metric scale:
  – the projection function satisfies project(p) = project(s · p) for an arbitrary scalar s and an arbitrary point p
  – a single IMU, instead, renders metric scale and gravity observable (due to the presence of gravity)
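A deliberately simplified sketch (my own) of the loosely coupled idea: a plain position/velocity Kalman filter in which the IMU acceleration drives the prediction and the VO pose estimate is fused as a position measurement. A real VIO filter is an EKF over the full state above (pose, velocity, biases):

```python
import numpy as np

class LooselyCoupledKF:
    """Toy loosely coupled fusion: IMU predicts, VO position corrects.

    State x = [p (3), v (3)]; a full VIO filter would also carry orientation
    and the IMU biases, as in the Kalman state listed above.
    """
    def __init__(self):
        self.x = np.zeros(6)
        self.P = np.eye(6)

    def predict(self, accel_world, dt, accel_noise=0.1):
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)                       # p += v * dt
        self.x = F @ self.x
        self.x[3:] += accel_world * dt                   # v += a * dt (gravity already removed)
        G = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
        Q = accel_noise**2 * (G @ G.T)                   # process noise from accel noise
        self.P = F @ self.P @ F.T + Q

    def update_with_vo(self, vo_position, vo_noise=0.05):
        H = np.hstack([np.eye(3), np.zeros((3, 3))])     # we observe the position only
        R = vo_noise**2 * np.eye(3)
        y = vo_position - H @ self.x                     # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```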
VIO, notable examples
• MSCKF – Multi-State Constraint Kalman Filter – tightly coupled
• OKVIS (Leutenegger et al., 2013, 2015) – Open Keyframe-based Visual-Inertial SLAM – tightly coupled
• ROVIO – Robust Visual-Inertial Odometry, a visual-inertial state estimator based on an extended Kalman filter (EKF) – tightly coupled
• VINS-Mono – tightly coupled (optimization-based)
• SVO+MSF – loosely coupled
Summary
• Visual Odometry – 3D-3D
• Local Bundle Adjustment
• Visual Inertial Odometry – VIO
  – Loosely Coupled EKF
  – Tightly Coupled EKF