PowerPoint Presentation
Changjae Oh
Computer Vision
– Restoration: spatial filtering –
Semester 1, 22/23
• Neighborhood in an image
• Convolution: review
̶ From ‘Signal Processing’ Lecture
• Spatial Filtering
̶ Low-pass (or high-pass) filter
̶ Gaussian (or Laplacian) filter
̶ Mean, median, or mode filter
Neighborhood in an image
• Spatial filtering is performed for each pixel (𝑖, 𝑗) using neighborhoods of (𝑖, 𝑗).
̶ 𝐼(𝑖, 𝑗): input image;  𝑂(𝑖, 𝑗): output image
̶ 𝑤(𝑠, 𝑡): filtering kernel (a.k.a. mask)
• The filtering operation is defined according to what kind of 𝑤(𝑠, 𝑡) is used.
• The filter serves as an essential building block for many applications.
̶ Blurring, sharpening, image restoration, and so on
O(i, j) = Σ_{s,t} w(s, t) I(i + s, j + t)
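As a concrete illustration, here is a minimal MATLAB sketch of this formula as nested loops (the cameraman test image and the 3 × 3 uniform kernel are example choices; boundary pixels are simply skipped for brevity):

I = double(imread('cameraman.tif'))/255;   % example input image
w = ones(3)/9;                             % example 3x3 kernel (uniform mean)
a = (size(w,1)-1)/2;  b = (size(w,2)-1)/2; % kernel half-sizes
O = zeros(size(I));
for i = 1+a : size(I,1)-a                  % skip boundary pixels for simplicity
    for j = 1+b : size(I,2)-b
        for s = -a:a
            for t = -b:b
                O(i,j) = O(i,j) + w(s+a+1, t+b+1) * I(i+s, j+t);
            end
        end
    end
end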
Spatial Filtering: Filter Kernel (Mask)
• Filter kernel should be defined accordingly, depending on applications.
O(i, j) = Σ_{s,t} w(s, t) I(i + s, j + t)

Filtering = SUM (Mask .× Neighborhood), e.g., with a 3 × 5 window centered at (i, j):

Neighborhood of (i, j):
I(i−1, j−2)  I(i−1, j−1)  I(i−1, j)  I(i−1, j+1)  I(i−1, j+2)
I(i,   j−2)  I(i,   j−1)  I(i,   j)  I(i,   j+1)  I(i,   j+2)
I(i+1, j−2)  I(i+1, j−1)  I(i+1, j)  I(i+1, j+1)  I(i+1, j+2)

Mask:
w(−1,−2)  w(−1,−1)  w(−1,0)  w(−1,1)  w(−1,2)
w(0,−2)   w(0,−1)   w(0,0)   w(0,1)   w(0,2)
w(1,−2)   w(1,−1)   w(1,0)   w(1,1)   w(1,2)
Convolution: From the perspective of signal processing
• Filtering = Convolution!
̶ When the filter kernel is symmetric
• Let’s start with 1-D convolution (from signal processing)
̶ Convolution integral
For a continuous signal (convolution integral):
y(t) = x(t) ∗ h(t) = ∫_{−∞}^{∞} x(τ) h(t − τ) dτ

For a discrete signal (convolution sum):
y[n] = x[n] ∗ h[n] = Σ_m x[m] h[n − m]
     = ⋯ + x[−1] h[n + 1] + x[0] h[n] + x[1] h[n − 1] + ⋯
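A quick MATLAB check of the discrete case with conv (the two sequences are arbitrary examples); note that an M-point signal convolved with an N-point response gives M + N − 1 output samples:

x = [1 2 3 4];            % M = 4 samples
h = [1 0 -1];             % N = 3 samples
y = conv(x, h);           % y[n] = sum_m x[m] h[n-m]
disp(y)                   % 1  2  2  2 -3 -4  (M + N - 1 = 6 samples)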
Convolution: From the perspective of signal processing
• From the perspective of signal processing
̶ An output signal 𝑦[𝑛] is obtained by passing an input signal 𝑥[𝑛] to a discrete system
with a response function ℎ[𝑛].
y[n] = x[n] ∗ h[n] = Σ_m x[m] h[n − m]

x[n] → [ system h[n] ] → y[n]
Note convolution in spatial domain = multiplication in frequency domain
Convolution Properties
• Associative property
̶ Impulse response of a cascade connection
= Convolution of the individual impulse responses
• Distributive property
̶ Impulse response of a parallel connection of LTI systems
= Sum of the individual impulse responses.
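These two properties can be sanity-checked numerically in MATLAB with conv (the signals below are arbitrary examples):

x  = [1 2 3];  h1 = [1 -1];  h2 = [0.5 0.5];
% Associative: cascade of h1 then h2 = single system with impulse response h1*h2
y_cascade  = conv(conv(x, h1), h2);
y_combined = conv(x, conv(h1, h2));
disp(max(abs(y_cascade - y_combined)))     % ~0
% Distributive: parallel connection = system with the summed impulse response
y_parallel = conv(x, h1) + conv(x, h2);
y_sum      = conv(x, h1 + h2);
disp(max(abs(y_parallel - y_sum)))         % ~0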
• The convolution on the continuous domain
For illustration, let an input signal x(t) and the
impulse response h(t) be the two functions below.
y(t) = x(t) ∗ h(t) = ∫_{−∞}^{∞} x(τ) h(t − τ) dτ
• The convolution on the continuous domain
̶ The functional transformation from ℎ(𝑡) to ℎ(𝑡 − 𝜏)
h(τ) ─reflect→ h(−τ) ─shift by t→ h(−(τ − t)) = h(t − τ)

y(t) = x(t) ∗ h(t) = ∫_{−∞}^{∞} x(τ) h(t − τ) dτ
• The convolution on the continuous domain
The convolution value at 𝑡:
The area under the product of 𝑥(𝑡) and ℎ(𝑡 − 𝜏).
For example, let 𝑡 = 5.
For 𝑡 = 5, the area under the product is zero.
y(5) = (x ∗ h)(5) = ∫_{−∞}^{∞} x(τ) h(5 − τ) dτ = 0
• The convolution on the continuous domain
When 𝑡 = 0,
y(0) = (x ∗ h)(0) = ∫_{−∞}^{∞} x(τ) h(−τ) dτ = 2
• The convolution on the continuous domain
The process of convolving to find 𝑦(𝑡) is illustrated below.
• The convolution on the discrete domain
What is 𝑦[𝑛]?
y[n] = x[n] ∗ h[n] = Σ_m x[m] h[n − m]
Note) 𝑥: 𝑀 points, ℎ: 𝑁 points
→ 𝑦: 𝑀 +𝑁 − 1 points
Convolution vs. Filtering
1D convolution:  y[i] = x[i] ∗ h[i] = Σ_s h[s] x[i − s]
Note) This formulation differs from the previous slides, but the result is identical; namely, x ∗ h = h ∗ x.

1D filtering:  O[i] = Σ_s w[s] I[i + s]

Remember 2D filtering:  O(i, j) = Σ_{s,t} w(s, t) I(i + s, j + t)
Suppose h[i] exists for −a ≤ i ≤ a and is symmetric, i.e., h[−i] = h[i].
Then, 1D convolution = 1D filtering!
This also applies to 𝑁-D convolution and 𝑁-D filtering (𝑁 ≥ 2).
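A small MATLAB check of this equivalence, assuming an arbitrary test image and a symmetric kernel: correlation-style filtering (filter2) and convolution (conv2) then agree.

I = rand(8);                            % arbitrary test image
w = [1 2 1; 2 4 2; 1 2 1]/16;           % symmetric kernel: w(-s,-t) = w(s,t)
O_filt = filter2(w, I, 'same');         % filtering (correlation)
O_conv = conv2(I, w, 'same');           % convolution
disp(max(abs(O_filt(:) - O_conv(:))))   % ~0 because the kernel is symmetric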
2D convolution – example
(Worked 2D convolution example shown over several slides; figures from D. Lowe.)
2D convolution – example
• Removing the blurred component (original − blurred) keeps the fine detail.
• Adding that detail back to the original boosts it, i.e., sharpens the image.
Source from L. Fei-Fei
2D convolution – example
• Deep learning: Convolutional Neural Network
Credit: https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/
2D Image Filtering: Uniform Mean Filter
• Uniform mean filter
̶ The simplest low-pass filter
Filter kernel
EBU6018- remember?
2D Image Filtering: Uniform Mean Filter
• How to process pixels at image boundary?
̶ 1) Mirroring at an image boundary
O(i, j) = Σ_{s,t} w(s, t) I′(i + 1 + s, j + 1 + t)
I′: mirror-padded image
2D Image Filtering: Uniform Mean Filter
• How to process pixels at image boundary?
̶ 2) Zero padding at an image boundary
̶ Common way in convolutional neural networks
O(i, j) = Σ_{s,t} w(s, t) I′(i + 1 + s, j + 1 + t)
I′: zero-padded image
2D Image Filtering: Uniform Mean Filter
• How to process pixels at image boundary?
̶ 3) Adjusting filter kernel
O(i, j) = (1 / Σ_{s,t} w′(s, t)) · Σ_{s,t} w′(s, t) I(i + s, j + t)
w′(s, t) = (0 ≤ i + s ≤ H − 1 and 0 ≤ j + t ≤ W − 1) ? w(s, t) : 0
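If the Image Processing Toolbox is available, imfilter exposes the first two boundary options directly; a brief sketch:

I = double(imread('cameraman.tif'))/255;
w = ones(3)/9;                              % 3x3 uniform mean kernel
O_zero   = imfilter(I, w, 0);               % zero padding at the boundary
O_mirror = imfilter(I, w, 'symmetric');     % mirroring at the boundary
figure, imshowpair(O_zero, O_mirror, 'montage')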
2D Image Filtering: Uniform Mean Filter
(a) original image, (b) 3 × 3 filter with zero padding, (c) 9 × 9 filter with zero padding, (d) 25 × 25 filter with zero padding, (e) 25 × 25 filter with mirroring
2D Image Filtering: Uniform Mean Filter
• Uniform mean filtering is separable.
̶ Separable filtering is VERY important in terms of runtime.
Connection: the 3 × 3 uniform kernel is separable, i.e., w(s, t) = w_s(s, 0) ∗ w_s(0, t):
(1/9)·[1 1 1; 1 1 1; 1 1 1] = (1/3)·[1; 1; 1] ∗ (1/3)·[1 1 1]
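A short MATLAB check of this separability, and of why it matters: the two 1D passes need about 6 multiplications per pixel instead of 9 for the 3 × 3 kernel, yet give the same result (the test image is random):

I = rand(256);
w2 = ones(3)/9;                       % 2D 3x3 mean kernel
wc = ones(3,1)/3;  wr = ones(1,3)/3;  % 1D column and row kernels
O_2d  = conv2(I, w2, 'same');         % one 2D pass
O_sep = conv2(conv2(I, wc, 'same'), wr, 'same');   % two 1D passes
disp(max(abs(O_2d(:) - O_sep(:))))    % ~0 (up to rounding)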
2D Image Filtering: Gaussian Filter
• Gaussian distribution
̶ One of the most commonly used parametric models
̶ The Fourier transform of a Gaussian function is again a Gaussian function
̶ Rotationally symmetric
̶ Separable filter
̶ 𝐺𝑎𝑢𝑠𝑠𝑖𝑎𝑛 ∗ 𝐺𝑎𝑢𝑠𝑠𝑖𝑎𝑛 = 𝐺𝑎𝑢𝑠𝑠𝑖𝑎𝑛
p(m) = (1 / ((2π)^(N/2) |Σ|^(1/2))) · exp(−(1/2)(m − μ)ᵀ Σ⁻¹ (m − μ))

Q: What happens if σ approaches infinity?
2D Image Filtering: Gaussian Filter
• Gaussian filter’s advantage
̶ It considers spatial distances within neighborhoods
̶ Blurred results look more natural than those of the mean filter.
In the results, μ_s = μ_t = 0 and σ_s = σ_t = σ (zero-mean Gaussian filter).
O(i, j) = Σ_{s,t} w(s, t) I(i + s, j + t)
Practical implementation
2D Image Filtering: Gaussian Filter
• Gaussian filter’s advantage
̶ Separable
O(i, j) = Σ_{s,t} w(s, t) I(i + s, j + t)
        = Σ_s w_s(s) [ Σ_t w_t(t) I(i + s, j + t) ]
Practical implementation
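A MATLAB sketch of separable Gaussian filtering (σ = 2 and the kernel size are example choices; fspecial is from the Image Processing Toolbox):

I = double(imread('cameraman.tif'))/255;
sigma = 2;  hsize = 2*ceil(3*sigma) + 1;         % cover ~3 sigma on each side
g1 = fspecial('gaussian', [hsize 1], sigma);     % 1D Gaussian (column), sums to 1
O = conv2(conv2(I, g1, 'same'), g1', 'same');    % vertical pass, then horizontal pass
figure, imshow(O)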
Frequency: Low- and High-pass Filters
• Frequency in an image
̶ How often do intensity values vary in neighbourhoods?
• Low-pass filter: uniform average filter, Gaussian filter
• High-pass filter: Sobel filter, Laplacian filter
(this estimates intensity change.)
Example of Low-pass filter:
Example of High-pass filter:
2D Image Filtering: Sobel Filter
• Using the first order gradient
∇I = (I_x, I_y) = (∂I/∂x, ∂I/∂y)

Practical implementation (S_x, S_y: Sobel kernels):
I_x = |S_x ∗ I|,  I_y = |S_y ∗ I|
Sobel filter output:  M(x, y) = sqrt(I_x² + I_y²)  or  |I_x| + |I_y|

For a color image:
M(x, y) = (M_R(x, y) + M_G(x, y) + M_B(x, y)) / 3
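A MATLAB sketch of the Sobel gradient magnitude (fspecial('sobel') returns the kernel that responds to horizontal edges; its transpose responds to vertical edges):

I  = double(imread('cameraman.tif'))/255;
Sy = fspecial('sobel');          % [1 2 1; 0 0 0; -1 -2 -1], horizontal edges
Sx = Sy';                        % vertical edges
Ix = imfilter(I, Sx, 'replicate');
Iy = imfilter(I, Sy, 'replicate');
M  = sqrt(Ix.^2 + Iy.^2);        % gradient magnitude (or abs(Ix) + abs(Iy))
figure, imshow(M, [])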
2D Image Filtering: Laplacian Filter
• Using the second order gradient
• 1st derivatives of an image I(x, y):
  I_x(x, y) = ∂I/∂x = I(x + 1, y) − I(x, y)
  I_y(x, y) = ∂I/∂y = I(x, y + 1) − I(x, y)
• 2nd derivatives of an image I(x, y):
  ∂²I/∂x² = I_x(x, y) − I_x(x − 1, y) = I(x + 1, y) + I(x − 1, y) − 2I(x, y)
  ∂²I/∂y² = I_y(x, y) − I_y(x, y − 1) = I(x, y + 1) + I(x, y − 1) − 2I(x, y)
• Laplacian:
  ∇²I(x, y) = ∂²I/∂x² + ∂²I/∂y² = I(x + 1, y) + I(x − 1, y) + I(x, y + 1) + I(x, y − 1) − 4I(x, y)
  Laplacian filtering output:  G(x, y) = L ∗ I(x, y), where L is the Laplacian kernel
2D Image Filtering: Laplacian Filter
• Using the second order gradient
More general form
Laplacian filter 𝐿
For a color image:
O(x, y) = (O_R(x, y) + O_G(x, y) + O_B(x, y)) / 3
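A MATLAB sketch using the 4-neighbour Laplacian kernel derived above (fspecial('laplacian') provides a parameterized variant of the more general form):

I = double(imread('cameraman.tif'))/255;
L = [0 1 0; 1 -4 1; 0 1 0];          % 4-neighbour Laplacian kernel
G = imfilter(I, L, 'replicate');     % Laplacian filtering output G(x, y)
figure, imshow(G, [])                % strong response where intensity changes rapidly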
Unsharp Masking
• Makes an image look sharper by boosting high-frequency components.
Unsharp mask
Unsharp Masking
𝑂𝑢𝑡𝑝𝑢𝑡 = (𝐼 − 𝑘𝐿)/(1 − 𝑘)
Scaling matters!
x = double(imread('cameraman.tif'))/255;
f = fspecial('average');            % 3x3 uniform mean (low-pass) kernel
xf = filter2(f,x);                  % low-pass filtered image
figure, imshow(xf)
% k: parameter determining how much of the low-frequency component is removed.
% As k increases, the output image looks sharper, but keeping k below 0.5 is recommended.
k = 0.4;                            % example value (must be set before use)
fi = zeros(3); fi(2,2) = 1;         % identity (impulse) kernel
f2 = (fi - k*f)/(1-k);              % unsharp-masking kernel: Output = (I - k*L)/(1 - k)
xf2 = filter2(f2,x);
figure, imshow(xf2)
Matlab Code
For color image, perform the operation for
RGB channels independently.
Unsharp Masking
3 × 3 uniform mean kernel (all entries 1/9), used as the low-pass filter that produces 𝐿:
1/9 1/9 1/9
1/9 1/9 1/9
1/9 1/9 1/9
𝑂𝑢𝑡𝑝𝑢𝑡 = (𝐼 − 𝑘𝐿)/(1 − 𝑘)
2D Image Filtering: Nonlinear Filter
• Linear vs. Nonlinear filters?
̶ Definition of linearity, and its advantages
• Some instances of nonlinear filter
̶ Max, Min, Median filter
Original    Max filter    Min filter    Median filter
2D Image Filtering: Nonlinear Filter
• Median Filter
1. Sort pixels within the window centered at reference pixel
2. Select the median value as an output.
Q: Mean vs. Median?
Original    Max filter    Min filter    Median filter
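A MATLAB sketch of these three nonlinear filters over a 3 × 3 window, using ordfilt2 and medfilt2 from the Image Processing Toolbox:

I = imread('cameraman.tif');
Imax = ordfilt2(I, 9, ones(3));   % 9th of the 9 sorted values = maximum
Imin = ordfilt2(I, 1, ones(3));   % 1st of the 9 sorted values = minimum
Imed = medfilt2(I, [3 3]);        % 5th of the 9 sorted values = median
figure, imshow(Imed)              % compare with imshow(Imax) and imshow(Imin)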
• Spatial Filtering
̶ Depends on how to define a filter kernel.
̶ Linear vs. nonlinear filters; some linear filters (e.g., uniform mean, Gaussian) are separable, which matters for runtime
̶ Mean, median, maximum, minimum, low-pass, high-pass, Gaussian, Laplacian filter
̶ Edge Sharpening: combination of low-pass and high-pass filters.
• Convolution
̶ Same as filtering when the kernel is symmetric
̶ Commutativity, Cascade property, Parallel property
Changjae Oh
Computer Vision
– Restoration: noise and noise removal –
Semester 1, 22/23
• Image Degradation Model
̶ Image Noise
• Noise removal
̶ Salt-and-Pepper Noise Removal
̶ Gaussian Noise Removal
̶ Periodic Noise Removal
Image Degradation Model
• Image restoration
̶ Aims to reduce the image degradation
• Types of image degradation
̶ Noise, out-of-focus blur, motion blur
Image Degradation Model
• When an input degraded image 𝑔(𝑥, 𝑦) is given, our goal is to estimate 𝑓(𝑥, 𝑦).
• 𝑛(𝑥, 𝑦): Additive noise
• ℎ 𝑥, 𝑦 : Blur kernel, which is the same as filtering mask.
• In frequency domain (using 2D DFT)?
𝑔 𝑥, 𝑦 = ℎ 𝑥, 𝑦 ∗ 𝑓 𝑥, 𝑦 + 𝑛(𝑥, 𝑦)
𝐺 𝑢, 𝑣 = 𝐻 𝑢, 𝑣 𝐹 𝑢, 𝑣 + 𝑁(𝑢, 𝑣)
(x, y): 2D image coordinates;  (u, v): 2D frequency coordinates
F(u, v) = (G(u, v) − N(u, v)) / H(u, v)
Is that it? It is not such a simple problem.
1) 𝑁(𝑢, 𝑣) is not known.
2) What if 𝐻(𝑢, 𝑣) = 0?
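A small MATLAB sketch of the model in the frequency domain, illustrating point 2): the frequency response H of a Gaussian blur (an example degradation) becomes vanishingly small at high frequencies, so naive division G/H blows up the noise.

f = double(imread('cameraman.tif'))/255;
h = fspecial('gaussian', size(f), 3);            % example blur kernel, same size as image
H = fft2(fftshift(h));                           % its frequency response
G = H .* fft2(f) + fft2(0.01*randn(size(f)));    % G = H F + N
F_naive = G ./ H;                                % naive inverse: divides by tiny |H|
figure, imshow(real(ifft2(F_naive)), [])         % dominated by amplified noise
disp(min(abs(H(:))))                             % many values are nearly zero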
Image Noise
̶ Any kind of degradation in an image caused by external disturbance
̶ To make the problem simpler, we assume a noise model in advance.
The most appropriate restoration method depends on the noise type.
• Noise types
̶ Periodic noise, Salt and Pepper noise, Gaussian noise, Speckle noise
• To begin with, let's consider the case without blur, i.e., ℎ(𝑥, 𝑦) acts as the identity, so that
𝑔 𝑥, 𝑦 = 𝑓 𝑥, 𝑦 + 𝑛(𝑥, 𝑦)
Image Noise: Salt and Pepper Noise
• Sharp and sudden disturbances
• Noise appears as white (salt) and black (pepper) pixels randomly scattered over the image.
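In MATLAB this noise model is available via imnoise (Image Processing Toolbox); a brief sketch with 10% of the pixels corrupted (the density is an example value):

I = imread('cameraman.tif');
J = imnoise(I, 'salt & pepper', 0.10);   % ~10% of pixels set to black or white
figure, imshowpair(I, J, 'montage')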
Image Noise: Gaussian Noise
• Additive White Gaussian Noise (AWGN)
̶ Additive noise: 𝐼𝐺(𝑥, 𝑦) = 𝐼(𝑥, 𝑦) + 𝑁(𝑥, 𝑦)
̶ White noise: random, normally distributed fluctuations that are uncorrelated from pixel to pixel
̶ Most approaches assume this type of noise.
̶ Usually, zero mean AWGN is assumed (𝜇 = 0).
The probability density function (PDF) of a Gaussian random variable z is
P(z) = (1 / (√(2π) σ)) · exp(−(z − μ)² / (2σ²))
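A MATLAB sketch of adding zero-mean AWGN with σ = 0.05 (an example level) to a [0, 1]-scaled image:

I = double(imread('cameraman.tif'))/255;
sigma = 0.05;
N = sigma * randn(size(I));              % zero-mean Gaussian noise, std sigma
IG = min(max(I + N, 0), 1);              % I_G(x,y) = I(x,y) + N(x,y), clipped for display
figure, imshow(IG)
% Equivalent with the Image Processing Toolbox: imnoise(I, 'gaussian', 0, sigma^2)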
Image Noise: Speckle Noise
• Multiplicative Noise
̶ 𝐼1(𝑥, 𝑦) = 𝐼(𝑥, 𝑦) + 𝐼(𝑥, 𝑦)𝑁(𝑥, 𝑦)
̶ N(x, y): zero-mean, uniformly distributed noise with standard deviation σ
̶ This noise typically appears in active radar, synthetic aperture radar (SAR), medical ultrasound, and optical coherence tomography images.
̶ Reducing speckle noise is MUCH more difficult because the noise is multiplicative.
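For comparison, the multiplicative model is also available in imnoise; a brief sketch (0.04 is the example variance of the uniform noise N):

I = double(imread('cameraman.tif'))/255;
Js = imnoise(I, 'speckle', 0.04);        % J = I + N.*I, N zero-mean uniform noise
figure, imshow(Js)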
Image Noise: Gaussian Noise vs. Speckle Noise
Gaussian Noise Speckle Noise
Image Noise: Periodic Noise
• Periodic fluctuation
̶ Electrical or electromechanical interference during image acquisition
̶ Spatially dependent noise
̶ It can be modeled as sinusoidal waves
• Image Degradation Model
̶ Image Noise
• Noise removal
̶ Salt-and-Pepper Noise Removal
̶ Gaussian Noise Removal
̶ Periodic Noise Removal
Salt and Pepper Noise Removal
• Low-Pass Filtering
̶ For instance, uniform averaging filter or Gaussian averaging filter
̶ Not so effective
• Median Filtering
̶ Median filter works well in the salt-and-pepper noise removal.
̶ Rank-order filtering is a generalization of the median filter.
• Outlier Rejection Method
Salt and Pepper Noise Removal
• Low-Pass Filtering
̶ NOT effective in removing the salt and pepper noise.
10% salt and pepper noise
(salt: 5%, pepper: 5%)
Original Image 3×3 average filter 7×7 average filter
Salt and Pepper Noise Removal: Median Filtering
• Median filtering procedure
1. Sort pixels within the window centered at reference pixel
2. Select the median value as an output.
̶ Good performance in the salt-and-pepper noise removal
3 × 3 median filter
Salt and Pepper Noise Removal: Median Filtering
• More results using median filter
20% salt and pepper noise
(salt: 10%, pepper: 10%)
Results: 3 × 3 median filter; 3 × 3 median filter applied twice; 5 × 5 median filter
Salt and Pepper Noise Removal: Outlier Rejection Method
• The brute force implementation of median filtering is very slow.
̶ A different method has been proposed to remove salt-and-pepper noise more efficiently.
• Key idea: Outlier detection → rejection
̶ Outlier intensities tend to differ noticeably from those of their neighboring pixels.
• Outlier rejection method
1. Choose a threshold value 𝐷
2. For a given pixel, compare its value 𝑝 with the mean 𝑚 of the values of its eight neighbors
3. If |𝑝 − 𝑚| > 𝐷, then classify the pixel as noisy, otherwise not.
4. If the pixel is noisy, replace its value with m.
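A MATLAB sketch of this four-step procedure (the threshold D = 0.2 and the 5% noise level are example values; the eight-neighbour mean is computed with a small averaging kernel):

I = double(imread('cameraman.tif'))/255;
I = imnoise(I, 'salt & pepper', 0.05);         % test image with outliers
D = 0.2;                                       % step 1: threshold (example value)
k = [1 1 1; 1 0 1; 1 1 1] / 8;                 % mean of the eight neighbours
m = imfilter(I, k, 'symmetric');               % step 2: neighbourhood mean m
noisy = abs(I - m) > D;                        % step 3: classify noisy pixels
O = I;  O(noisy) = m(noisy);                   % step 4: replace noisy pixels with m
figure, imshowpair(I, O, 'montage')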
Salt and Pepper Noise Removal: Outlier Rejection Method
Gaussian Noise Removal
• Why does an image average filter work well for Gaussian Noise removal?
• Additive White Gaussian Noise (AWGN) 𝑁(𝑥, 𝑦) is added as below.
̶ 𝐼𝐺(𝑥, 𝑦) = 𝐼(𝑥, 𝑦) + 𝑁(𝑥, 𝑦)
1. Suppose we have 100 noisy images I_G^i:
   I_G^i(x, y) = I(x, y) + N_i(x, y),   i = 1, …, 100
2. Average the 100 noisy images:
   (1/100) Σ_i I_G^i(x, y) = I(x, y) + (1/100) Σ_i N_i(x, y)
   The second term will be close to 0, as AWGN N has a zero-mean Gaussian PDF.
10×10 kernel
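A MATLAB sketch of this averaging argument, generating 100 independent noisy copies with σ = 0.1 (example values); the residual noise standard deviation drops by a factor of √100:

I = double(imread('cameraman.tif'))/255;
sigma = 0.1;  K = 100;
acc = zeros(size(I));
for i = 1:K
    acc = acc + (I + sigma*randn(size(I)));   % I_G^i = I + N_i
end
Iavg = acc / K;                               % noise term averages towards 0
fprintf('noise std before: %.3f, after averaging: %.3f\n', sigma, sigma/sqrt(K));
figure, imshowpair(I + sigma*randn(size(I)), Iavg, 'montage')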
Gaussian Noise Removal
• Why does an image average filter work well for Gaussian Noise removal?
Image averaging to remove Gaussian noise. (a) 10 images (b) 100 images
Gaussian Noise Removal: Simple Average Filtering
• Simple Average Filtering: Uniform mean filter or Gaussian filter
̶ Using a small window: not so effective in noise removal.
̶ Using a large window: effective in noise removal, but the output is over-smoothed.
Uniform mean filter to remove Gaussian noise. (a) 3×3 averaging (b) 5×5 averaging
Gaussian Noise Removal: Bilateral Filtering
• Bilateral filter for grayscale image
̶ One of the most popular filters with various applications
̶ Considers both spatial and intensity distances
̶ The output at (i, j) is a weighted average whose weights combine a spatial term and an intensity (range) term:
O(i, j) = (1 / W(i, j)) Σ_{s,t} exp(−(s² + t²) / (2σ_s²)) · exp(−(I(i, j) − I(i + s, j + t))² / (2σ_r²)) · I(i + s, j + t)
W(i, j) = Σ_{s,t} exp(−(s² + t²) / (2σ_s²)) · exp(−(I(i, j) − I(i + s, j + t))² / (2σ_r²))
̶ With p = (i, j) and q = (i + s, j + t), this can be rewritten as:
BF[I]_p = (1 / W_p) Σ_q G_σs(‖p − q‖) G_σr(|I_p − I_q|) I_q,   W_p = Σ_q G_σs(‖p − q‖) G_σr(|I_p − I_q|)
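A brute-force MATLAB sketch of this grayscale bilateral filter (window radius, σ_s = 3, and σ_r = 0.1 are example settings; the loop form is for clarity, not speed):

I = double(imread('cameraman.tif'))/255;
sigma_s = 3;  sigma_r = 0.1;  r = ceil(3*sigma_s);       % example parameters
[S, T] = meshgrid(-r:r, -r:r);
Gs = exp(-(S.^2 + T.^2) / (2*sigma_s^2));                % spatial weights (fixed)
Ipad = padarray(I, [r r], 'symmetric');
O = zeros(size(I));
for i = 1:size(I,1)
    for j = 1:size(I,2)
        patch = Ipad(i:i+2*r, j:j+2*r);                  % neighbourhood of (i,j)
        Gr = exp(-(patch - I(i,j)).^2 / (2*sigma_r^2));  % range weights
        w = Gs .* Gr;
        O(i,j) = sum(w(:) .* patch(:)) / sum(w(:));      % normalised weighted mean
    end
end
figure, imshowpair(I, O, 'montage')

Recent releases of the Image Processing Toolbox also ship imbilatfilt, which can be used instead of this explicit loop.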
Gaussian Noise Removal: Bilateral Filtering
• Bilateral filter for color image
̶ Applying filter to each channel
R′_p = (1 / W_p) Σ_q G_σs(‖p − q‖) G_σr(‖C_p − C_q‖) R_q
G′_p = (1 / W_p) Σ_q G_σs(‖p − q‖) G_σr(‖C_p − C_q‖) G_q
B′_p = (1 / W_p) Σ_q G_σs(‖p − q‖) G_σr(‖C_p − C_q‖) B_q
W_p = Σ_q G_σs(‖p − q‖) G_σr(‖C_p − C_q‖),   C_p = (R_p, G_p, B_p)
p = (i, j),  q = (i + s, j + t)
Gaussian filter vs. Bilateral filter
• Gaussian filter
̶ Weighted average of neighbors
̶ Depends only on spatial distance
̶ No edge term
Gaussian filter vs. Bilateral filter
• Bilateral filter
̶ Weighted average of neighbors
̶ Depends on spatial and range difference
Bilateral filter on a height field
input and output height fields (reproduced from [Durand 02])
BF[I]_p = (1 / W_p) Σ_q G_σs(‖p − q‖) G_σr(|I_p − I_q|) I_q
Gaussian Noise Removal: Non-local Means Filtering
• Same goals:
̶ Smooth within Similar Regions
• KEY INSIGHT:
̶ Generalize, extend ‘Similarity’
• Bilateral:
̶ Averages neighbors with similar intensities;
• NL-Means:
̶ Averages neighbors with similar neighborhoods!
Gaussian Noise Removal: Non-local Means Filtering
• For each pixel p:
̶ Define a small, simple fixed size neighborhood;
̶ Define vector Vp: a list of neighboring pixel values.
Gaussian Noise Removal: Non-local Means Filtering
• For each pixel p:
̶ Define a small, simple fixed size neighborhood;
̶ Define vector Vp: a list of neighboring pixel values.
̶ ‘Similar’ pixels p, q → SMALL distance ‖V_p − V_q‖²
̶ ‘Dissimilar’ pixels p, r → LARGE distance ‖V_p − V_r‖²
Gaussian Noise Removal: Non-local Means Filtering
• For each pixel p:
̶ Define a small, simple fixed size neighborhood;
̶ Define vector Vp: a list of neighboring pixel values.
̶ ‘Similar’ pixels p, q → SMALL distance ‖V_p − V_q‖²
̶ ‘Dissimilar’ pixels p, r → LARGE distance ‖V_p − V_r‖²
• Filtering with these neighborhood vectors!
̶ No spatial term (beyond restricting q to a search window);
̶ the weight measures the distance between patches (neighborhood vectors) V_p and V_q:
NLMF[I]_p = (1 / W_p) Σ_q G_σr(‖V_p − V_q‖) I_q,   W_p = Σ_q G_σr(‖V_p − V_q‖)
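A naive MATLAB sketch of this idea (3 × 3 patches, an 11 × 11 search window, and σ_r = 0.15 are example settings; the hard search-window limit stands in for any spatial term, and real implementations are heavily optimized):

I = double(imread('cameraman.tif'))/255;
I = I + 0.05*randn(size(I));                  % noisy test image
pr = 1;  sr = 5;                              % patch radius, search radius
sigma_r = 0.15;                               % patch-similarity bandwidth
Ipad = padarray(I, [pr+sr, pr+sr], 'symmetric');
O = zeros(size(I));
for i = 1:size(I,1)
    for j = 1:size(I,2)
        ci = i + pr + sr;  cj = j + pr + sr;              % centre in padded image
        Vp = Ipad(ci-pr:ci+pr, cj-pr:cj+pr);              % reference patch V_p
        wsum = 0;  val = 0;
        for di = -sr:sr
            for dj = -sr:sr
                Vq = Ipad(ci+di-pr:ci+di+pr, cj+dj-pr:cj+dj+pr);
                d2 = sum((Vp(:) - Vq(:)).^2);             % ||V_p - V_q||^2
                w  = exp(-d2 / (2*sigma_r^2));            % patch-similarity weight
                val  = val  + w * Ipad(ci+di, cj+dj);
                wsum = wsum + w;
            end
        end
        O(i,j) = val / wsum;
    end
end
figure, imshowpair(I, O, 'montage')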
Gaussian Noise Removal: Non-local Means Filtering
• Noisy source image:
Gaussian Noise Removal: Non-local Means Filtering
• Gaussian Filter
̶ Low noise
̶ Low detail
Gaussian Noise Removal: Non-local Means Filtering
• Bilateral Filter
̶ Better at noise removal
̶ but ‘stairsteps’
Gaussian Noise Removal: Non-local Means Filtering
• NL-Means
̶ Low noise
̶ Few artifacts
Non-local Means Filtering: Old-fashioned?
• Vision Transformer (in deep learning)
̶ A new architecture that shows state-of-the-art performance in computer vision tasks
G_σr(‖V_p − V_q‖)   (the NL-means patch-similarity weight, shown for comparison)
Many applications, not limited to denoising
Tone Mapping [Durand 02]
Virtual Video Exposure [Bennett 05]
Flash / No-Flash [Eisemann 04, Petschnigg 04]
Tone Management [Bae 06]
Edge-aware smoothing filters
[Shen 2015]
Periodic Noise Removal: Frequency Domain Filtering
• Periodic fluctuation can be modeled as sinusoidal waves, which can be identified and suppressed in the frequency domain.
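Since the slide is cut off here, this is only a minimal MATLAB sketch of the idea, assuming a synthetic horizontal sinusoidal interference (25 cycles across the image, amplitude 0.3, both example values) whose two spectral peaks are simply zeroed out (a crude notch filter):

f = double(imread('cameraman.tif'))/255;
[M, N] = size(f);
[X, ~] = meshgrid(1:N, 1:M);
u0 = 25;                                    % interference frequency: 25 cycles across the width
g = f + 0.3*sin(2*pi*u0*X/N);               % degraded image g = f + periodic noise
G = fftshift(fft2(g));                      % centred spectrum: two bright off-centre peaks
figure, imshow(log(1 + abs(G)), [])
cy = floor(M/2) + 1;  cx = floor(N/2) + 1;  % location of the DC component
G(cy, cx + u0) = 0;  G(cy, cx - u0) = 0;    % notch out the two interference peaks
f_hat = real(ifft2(ifftshift(G)));
figure, imshowpair(g, f_hat, 'montage')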