Variational Methods: A Short Intro

Prof. Daniel Cremers

updated April 12, 2021

Chapter 9
Variational Methods: A Short Intro
Multiple View Geometry
Summer 2021

Prof. Daniel Cremers
Chair for Computer Vision and Artificial Intelligence

Departments of Informatics & Mathematics
Technical University of Munich


Overview

1 Variational Methods

2 Variational Image Smoothing

3 Euler-Lagrange Equation

4 Gradient Descent

5 Adaptive Smoothing

6 Euler and Lagrange


Variational Methods

Variational methods are a class of optimization methods. They
are popular because they allow many problems to be solved in a
mathematically transparent manner. Instead of implementing a
heuristic sequence of processing steps (as was commonly
done in the 1980s), one clarifies beforehand what properties
an 'optimal' solution should have.

Variational methods are particularly popular for
infinite-dimensional problems and spatially continuous
representations.

Typical applications are:
• Image denoising and image restoration
• Image segmentation
• Motion estimation and optical flow
• Spatially dense multiple-view reconstruction
• Tracking


Advantages of Variational Methods

Variational methods have many advantages over heuristic
multi-step approaches (such as the Canny edge detector):

• A mathematical analysis of the considered cost function
allows one to make statements about the existence and
uniqueness of solutions.

• Approaches with multiple processing steps are difficult to
modify: each step relies on the output of the previous one,
so exchanging one module for another typically requires
re-engineering the entire processing pipeline.

• Variational methods make all modeling assumptions
transparent; there are no hidden assumptions.

• Variational methods typically have fewer tuning
parameters, and the effect of each parameter is clear.

• Variational methods are easily fused: one simply adds the
respective energies / cost functions.


Example: Variational Image Smoothing

Let f : Ω → R be a grayvalue input image on the domain
Ω ⊂ R². We assume that the observed image arises from some
'true' image corrupted by additive noise. We are interested in a
denoised version u of the input image f.

The approximation u should fulfill two properties:
• It should be as similar as possible to f.
• It should be spatially smooth (i.e. 'noise-free').

Both criteria can be encoded in a cost function of the form

E(u) = E_data(u, f) + E_smoothness(u)

The first term measures the similarity of f and u; the second
measures the smoothness of the (hypothetical) function u.

Most variational approaches have the above form. They merely
differ in the specific form of the data (similarity) term and the
regularity (or smoothness) term.


Example: Variational Image Smoothing
For denoising a grayvalue image f : Ω ⊂ R² → R, specific
examples of data and smoothness term are:

E_{\text{data}}(u, f) = \int_\Omega \big( u(x) - f(x) \big)^2 \, dx,

and

E_{\text{smoothness}}(u) = \int_\Omega |\nabla u(x)|^2 \, dx,

where ∇ = (∂/∂x, ∂/∂y)ᵀ denotes the spatial gradient.

Minimizing the weighted sum of data and smoothness term,

E(u) = \int_\Omega \big( u(x) - f(x) \big)^2 \, dx + \lambda \int_\Omega |\nabla u(x)|^2 \, dx, \qquad \lambda > 0,

leads to a smooth approximation u : Ω → R of the input image.

Such energies, which assign a real value to a function, are
called functionals. How does one minimize functionals where
the argument is a function u(x) (rather than a finite number of
parameters)?
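Before turning to the minimization, the energy itself is easy to make concrete. The following is a small illustrative sketch (not part of the lecture): the discrete 1D analogue of E(u), with the integrals replaced by sums over a grid with unit spacing and u' by a forward difference.

```python
def energy(u, f, lam):
    """Discrete 1D analogue of E(u) = ∫(u-f)² dx + λ ∫|u'|² dx."""
    data = sum((ui - fi) ** 2 for ui, fi in zip(u, f))               # data term
    smooth = sum((u[i + 1] - u[i]) ** 2 for i in range(len(u) - 1))  # smoothness term
    return data + lam * smooth

f = [0.0, 1.0, 0.0, 1.0, 0.0]   # rapidly oscillating "noisy" signal
print(energy(f, f, 0.5))         # u = f: data term 0, smoothness term 0.5 * 4 → 2.0
```

Choosing u = f makes the data term vanish but leaves a large smoothness penalty; a constant u does the opposite. The minimizer of E balances the two, weighted by λ.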


Functional Minimization & Euler-Lagrange Equation

• A necessary condition for a minimizer of a functional is that
the associated Euler-Lagrange equation holds. For a
functional of the form

E(u) = \int L(u, u') \, dx,

it is given by

\frac{dE}{du} = \frac{\partial L}{\partial u} - \frac{d}{dx} \frac{\partial L}{\partial u'} = 0.

• The central idea of variational methods is therefore to
determine solutions of the Euler-Lagrange equation of a
given functional. For general non-convex functionals this is
a difficult problem.

• An alternative is to start with an (appropriate) function
u₀(x) and to modify it step by step such that the value of the
functional decreases in each iteration. Such methods are
called descent methods.
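For the smoothing energy considered above, with integrand L(u, u') = ½(u − f)² + (λ/2)|u'|² (one spatial dimension for brevity), the Euler-Lagrange equation can be worked out explicitly:

```latex
\frac{\partial L}{\partial u} = u - f,
\qquad
\frac{\partial L}{\partial u'} = \lambda\, u',
\qquad\Longrightarrow\qquad
\frac{dE}{du} \;=\; (u - f) \;-\; \lambda\, u'' \;=\; 0 .
```

This is a linear second-order ODE in u (with f as data); its negative is exactly the descent direction used on the next slide.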


Gradient Descent

One specific descent method is called gradient descent or
steepest descent. The key idea is to start from an initialization
u(x, t = 0) and to iteratively march in the direction of the
negative energy gradient.

For the class of functionals considered above, gradient
descent is given by the following partial differential equation:

u(x, 0) = u₀(x),

\frac{\partial u(x,t)}{\partial t} = -\frac{dE}{du} = -\frac{\partial L}{\partial u} + \frac{d}{dx} \frac{\partial L}{\partial u'}.

Specifically, for L(u, u') = \frac{1}{2} \big( u(x) - f(x) \big)^2 + \frac{\lambda}{2} |u'(x)|^2 this means:

\frac{\partial u}{\partial t} = (f - u) + \lambda u''.

If the gradient descent evolution converges, i.e.
∂u/∂t = −dE/du = 0, then we have found a solution of the
Euler-Lagrange equation.
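This evolution is straightforward to discretize. The sketch below (my own illustration, not from the lecture; unit grid spacing, replicated boundary values, and an ad-hoc step size τ chosen small enough for stability) iterates an explicit Euler step of ∂u/∂t = (f − u) + λu'' on a 1D signal:

```python
def denoise(f, lam=1.0, tau=0.2, steps=200):
    """Explicit gradient descent for du/dt = (f - u) + lam * u''."""
    u = list(f)                                    # initialize with the noisy input
    n = len(u)
    for _ in range(steps):
        new = []
        for i in range(n):
            left = u[max(i - 1, 0)]                # replicate values at the boundary
            right = u[min(i + 1, n - 1)]
            lap = left - 2 * u[i] + right          # finite-difference u''
            new.append(u[i] + tau * ((f[i] - u[i]) + lam * lap))
        u = new
    return u

f = [0.0, 1.0, 0.0, 1.0, 0.0]                      # oscillating "noise"
u = denoise(f)
```

At convergence u satisfies the discrete Euler-Lagrange equation (u − f) = λu'': the result stays close to f but the oscillations are strongly damped.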


Image Smoothing by Gradient Descent

E(u) = \int (f - u)^2 \, dx + \lambda \int |\nabla u|^2 \, dx \;\to\; \min.

E(u) = \int |\nabla u|^2 \, dx \;\to\; \min.

(Figure: smoothing results for the two energies; author: D. Cremers)


Discontinuity-preserving Smoothing

E(u) = \int |\nabla u|^2 \, dx \;\to\; \min.

E(u) = \int |\nabla u| \, dx \;\to\; \min.

(Figure: comparison of the two smoothness terms; author: D. Cremers)
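The second energy (the total variation ∫|∇u| dx) penalizes large gradients only linearly, so sharp edges survive the smoothing. A rough 1D sketch of the corresponding descent (my own illustration, not from the lecture; ε regularizes |u'| so the flux u'/√(u'² + ε²) is well defined, and the parameter values are ad hoc):

```python
import math

def tv_denoise(f, lam=0.5, tau=0.05, eps=0.5, steps=500):
    """Explicit descent for du/dt = (f - u) + lam * d/dx( u' / sqrt(u'^2 + eps^2) )."""
    u = list(f)
    n = len(u)
    for _ in range(steps):
        d = [u[i + 1] - u[i] for i in range(n - 1)]               # forward differences u'
        flux = [di / math.sqrt(di * di + eps * eps) for di in d]  # regularized u'/|u'|
        new = []
        for i in range(n):
            # discrete divergence of the flux (zero flux across the boundary)
            div = (flux[i] if i < n - 1 else 0.0) - (flux[i - 1] if i > 0 else 0.0)
            new.append(u[i] + tau * ((f[i] - u[i]) + lam * div))
        u = new
    return u

step = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # an ideal edge
u = tv_denoise(step)
```

In contrast to the quadratic smoothness term, the step edge is largely preserved: the diffusion coefficient effectively vanishes where the gradient is large.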



Leonhard Euler

Leonhard Euler (1707 – 1783)

• Published 886 papers and books, most of these in the last
20 years of his life. He is generally considered the most
influential mathematician of the 18th century.

• Contributions: Euler number, Euler angles, Euler formula,
Euler theorem, Euler equations (for fluid flow),
Euler-Lagrange equations, …

• 13 children


Joseph-Louis Lagrange

Joseph-Louis Lagrange (1736 – 1813)

• Born Giuseppe Lodovico Lagrangia (in Turin); largely self-taught.
• At the age of 19: Chair for mathematics in Turin.
• Later worked in Berlin (1766-1787) and Paris (1787-1813).
• 1788: La Méchanique Analytique.
• 1800: Leçons sur le calcul des fonctions.

