
Digital Image and Video Processing

(404184)

“FACULTY ORIENTATION WORKSHOP ON BE

REVISED SYLLABUS 2015 COURSE”

UNDER AEGIS OF BOARD OF STUDIES

ELECTRONICS, SPPU, PUNE

Presenter:

Mrs. Priya Charles, Head, E&TC

DYPIEMR, Pune

Structure

Unit I

Unit I difference

Unit I : Fundamentals of Image Processing (6L)(old)

Steps in image processing, Human visual system, Sampling & quantization, Representing digital images, Spatial & gray-level resolution, Image file formats, Basic relationships between pixels, Distance Measures. Basic operations on images: image addition, subtraction, logical operations, scaling, translation, rotation. Image Histogram. Color fundamentals & models: RGB, HSI, YIQ.

Unit I : Fundamentals of Image Processing (5 Hrs)(new)

Steps in image processing, Human visual system, Sampling & quantization, Representing digital images, Spatial and gray-level resolution, Image file formats, Basic relationships between pixels, Distance Measures. Basic operations on images: image addition, subtraction, logical operations, scaling, translation, rotation. Color fundamentals and models: RGB, HSI, YIQ.

Teaching Scheme: Lecture: 03 hrs/week

Course Objectives:

1. Understand the fundamental concepts of Digital Image Processing with basic relationships of pixels and mathematical operations on 2-D data.

2. Learn to design and integrate image enhancement and image restoration techniques.

3. Understand object segmentation and image analysis techniques.

4. Learn the need for effective use of resources such as storage and bandwidth, and ways to use them effectively through data compression techniques.

5. Learn basic concepts of video processing.

Course Outcomes

On completion of the course, the student will be able to:

1) Develop and implement basic mathematical operations on digital images.

2) Analyze and solve image enhancement and image restoration problems.

3) Identify and design image processing techniques for object segmentation and recognition.

4) Represent objects and regions of the image with appropriate methods.

5) Apply 2-D data compression techniques for digital images.

6) Explore video signal representation and different algorithms for video processing.

Practical

(Perform any 8 practicals using appropriate software)

1. Perform basic operations on images.

2. Perform conversion between color spaces.

3. Perform histogram equalization.

4. Perform image filtering in spatial domain.

5. Perform image filtering in frequency domain.

6. Perform image restoration.

7. Perform image compression using DCT / Wavelet transform.

8. Perform edge detection using various masks.

9. Perform global and adaptive thresholding.

10. Apply morphological operators on an image.

11. Obtain boundary / regional descriptors of an image.

12. Extract frames from a video, improve their quality, and convert them back to a compressed video.

Books

Text books:

1. Rafael C. Gonzalez and Richard E. Woods, “Digital Image Processing”, Third Edition, Pearson Education

2. Iain E. G. Richardson, “H.264 and MPEG-4 Video Compression: Video Coding for Next-generation Multimedia”, John Wiley & Sons

Reference Books:

1. A. K. Jain, Fundamentals of digital image processing, Prentice

Hall of India, 1989.

2. Pratt William K. "Digital Image Processing", John Wiley &

sons

3. A. Bovik, Handbook of Image & Video Processing, Academic

Press, 2000

Syllabus Mapping with Book

Sr. No. | Contents | Mapping
1 | Steps in Image processing (1 hr) | T1 – Chapter 1 (25–28)
2 | Human visual system (15 min) | T1 – Chapter 2 (36–44)
3 | Sampling & quantization, Representing digital images, spatial and gray-level resolution (1 hr) | T1 – Chapter 2 (52–65)
4 | Image file formats (15 min) | Additional: pg. 61
5 | Basic relationships between pixels, Distance Measures (1 hr) | T1 – Chapter 2 (68–72)
6 | Basic operations on images: image addition, subtraction, logical operations, scaling, translation, rotation (1 hr) | T1 – Chapter 2 (72–95)
7 | Color fundamentals and models: RGB, HSI, YIQ (1/2 hr) | T1 – Chapter 6 (394–413)

T1: Rafael C. Gonzalez and Richard E. Woods, “Digital Image Processing”, Third Edition, Pearson Education

Additional: S Sridhar, “Digital Image Processing”, Oxford University Press


Digital image processing is the study of representation and manipulation of pictorial information by a computer. One goal is to improve pictorial information for better clarity (human interpretation).

Examples:

1. Enhancing the edges of an image to make it appear sharper

2. Removing “noise” from an image

3. Removing motion blur from an image

Introduction and Digital Image Fundamentals

History of Digital Image Processing

Early 1920s: One of the first applications of digital imaging was in the newspaper industry: the Bartlane cable picture transmission service. Images were transferred by submarine cable between London and New York. Pictures were coded for cable transfer and reconstructed at the receiving end on a telegraph printer.

Early digital image

History of DIP

Mid to late 1920s: Improvements to the Bartlane system resulted in higher quality images:

New reproduction processes based on photographic techniques

Increased number of tones in reproduced images

(Figures: improved digital image; early 15-tone digital image)

History of DIP

1960s: Improvements in computing technology and the onset of the space race led to a surge of work in digital image processing.

1964: Computers were used to improve the quality of images of the moon taken by the Ranger 7 probe. Such techniques were used in other space missions, including the Apollo landings.

(Figure: a picture of the moon taken by the Ranger 7 probe minutes before landing)

History of DIP

1970s: Digital image processing begins to be used in medical applications.

1979: Sir Godfrey N. Hounsfield and Prof. Allan M. Cormack share the Nobel Prize in medicine for the invention of tomography, the technology behind Computerised Axial Tomography (CAT) scans.

(Figure: typical head-slice CAT image)

History of DIP

1980s – Today: The use of digital image processing techniques has exploded, and they are now used for all kinds of tasks in all kinds of areas:

Image enhancement/restoration

Artistic effects

Medical visualisation

Industrial inspection

Law enforcement

Human computer interfaces

Examples: Image Enhancement

One of the most common uses of DIP techniques: improve quality, remove noise, etc.

Examples: The Hubble Telescope

Launched in 1990, the Hubble telescope can take images of very distant objects. However, a flawed mirror made many of Hubble’s images useless; image processing techniques were used to fix this.

Examples: Artistic Effects

Artistic effects are used to make images more visually appealing, to add special effects and to make composite images.

Examples: Medicine

Take a slice from an MRI scan of a canine heart and find the boundaries between types of tissue:

Image with gray levels representing tissue density

Use a suitable filter to highlight edges

(Figures: original MRI image of a dog heart; edge-detection image)

Examples: GIS (Geographic Information Systems)

Digital image processing techniques are used extensively to manipulate satellite imagery:

Terrain classification

Meteorology

The Night-Time Lights of the World data set is a global inventory of human settlement; it is not hard to imagine the kind of analysis that might be done using this data.

Examples: Industrial Inspection

Human operators are expensive, slow and unreliable, so machines do the job instead. Industrial vision systems are used in all kinds of industries.

Examples: PCB Inspection

Printed Circuit Board (PCB) inspection: machine inspection is used to determine that all components are present and that all solder joints are acceptable. Both conventional imaging and x-ray imaging are used.

Examples: Law Enforcement

Image processing techniques are used extensively by law enforcement:

Number plate recognition for speed cameras/automated toll systems

Fingerprint recognition

Enhancement of CCTV images

Examples: HCI

Trying to make human computer interfaces more natural:

Face recognition

Gesture recognition

These tasks can be extremely difficult.

Visual Perception: Human Eye

(Picture from Microsoft Encarta 2000)

1. The lens contains 60–70% water and about 6% fat.

2. The iris diaphragm controls the amount of light that enters the eye.

3. Light receptors in the retina:

- About 6–7 million cones for bright-light (photopic) vision.
- The density of cones is about 150,000 elements/mm².
- Cones are involved in color vision and are concentrated in the fovea, an area of about 1.5 × 1.5 mm².
- About 75–150 million rods for dim-light (scotopic) vision.
- Rods are sensitive to low levels of light and are not involved in color vision.

4. The blind spot is the region where the optic nerve emerges from the eye.

Visual Perception: Human Eye (cont.)

Distribution of Rods and Cones in the Retina

Image Formation in the Human Eye

(Picture from Microsoft Encarta 2000; images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition)


Brightness Adaptation of the Human Eye: Mach Band Effect

Intensities of surrounding points affect the perceived brightness at each point. In this image, edges between bars appear brighter on the right side and darker on the left side.

In area A the perceived brightness is darker, while in area B it is brighter. This phenomenon is called the Mach Band Effect.

(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition; figure: intensity vs. position profile with regions A and B)

Mach Band Effect (Cont)

Mind Map Exercise: Mind Mapping For Note Taking

Beau Lotto: Optical Illusions Show How We See

http://www.ted.com/talks/lang/eng/beau_lotto_optical_illusions_show_how_we_see.html

Image “After snow storm”

Fundamentals of Digital Images

An image: a multidimensional function of spatial coordinates, f(x,y), with the origin conventionally at the top-left corner.

Spatial coordinate: (x,y) for the 2D case such as a photograph, (x,y,t) for movies.

The function f may represent intensity (for monochrome images), color (for color images) or other associated values.

Digital Image Representation

Key steps in Digital Image Processing

Problem Domain → Image Acquisition → Image Enhancement → Image Restoration → Color Image Processing → Image Compression → Morphological Processing → Segmentation → Representation & Description → Object Recognition

A Knowledge Base interacts with every stage.

Key Stages in Digital Image Processing

The following slides repeat the same block diagram, highlighting one stage at a time: Image Acquisition, Image Enhancement, Image Restoration, Morphological Processing, Segmentation, Representation & Description, Object Recognition, Image Compression, and Colour Image Processing.

Digital Image Types: Intensity Image

Intensity image or monochrome image: each pixel corresponds to light intensity, normally represented in gray scale (gray level).

(Figure: a small matrix of gray-scale values, one per pixel)

Digital Image Types: RGB Image

Color image or RGB image: each pixel contains a vector representing red, green and blue components.

(Figure: RGB components)

Image Types: Binary Image

Binary image or black-and-white image: each pixel contains one bit, where 1 represents white and 0 represents black.

(Figure: binary data, e.g. rows 1 1 1 1 / 1 1 1 1 / 0 0 0 0 / 0 0 0 0)

Gray Level and Color Images

A gray-level image is a matrix:

f(0,0)   f(0,1)   f(0,2)   ….  f(0,n-1)
f(1,0)   f(1,1)   f(1,2)   ….  f(1,n-1)
  .        .        .
f(m-1,0) f(m-1,1) f(m-1,2) ….  f(m-1,n-1)

An image of m rows and n columns; f(i,j) is in [0,255].

Gray and Color Image Data

Sample gray values: 0, 64, 144, 196, 225, 169, 100, 36

(R, G, B) for a color pixel:

Red – (255, 0, 0)
Green – (0, 255, 0)
Blue – (0, 0, 255)
Cyan – (0, 255, 255)
Magenta – (255, 0, 255)
Yellow – (255, 255, 0)
Gray – (128, 128, 128)
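These triplets can be checked with a short sketch, assuming NumPy-style Python as the "appropriate software" of the practicals; the gray-conversion weights (Rec.601 luma) are an illustrative assumption, since the slides only say f may represent intensity or colour:

```python
# Named colour triplets from the slide, as (R, G, B) with 8-bit channels.
colors = {
    "red":     (255, 0, 0),
    "green":   (0, 255, 0),
    "blue":    (0, 0, 255),
    "cyan":    (0, 255, 255),
    "magenta": (255, 0, 255),
    "yellow":  (255, 255, 0),
    "gray":    (128, 128, 128),
}

def to_gray(rgb):
    """Collapse an (R, G, B) pixel to a single intensity.
    Rec.601 luma weights -- an assumed choice, not from the slides."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

# A neutral gray pixel maps to its own channel value.
assert round(to_gray(colors["gray"])) == 128
```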

How to choose the spatial resolution

(Figure: original image vs. sampled image, with sampling locations marked)

With under-sampling, we lose some image detail!

How to choose the spatial resolution: Nyquist Rate

(Figure: original image with sampling locations at the minimum period; sampled image — no detail is lost)

Nyquist Rate: the spatial sampling period must be less than or equal to half of the minimum period of the image, or equivalently the sampling frequency must be greater than or equal to twice the maximum frequency. For example, if the minimum period is 2 mm, the sampling period must be at most 1 mm.

Effect of Spatial Resolution

(Figures: the same image at 256×256, 128×128, 64×64 and 32×32 pixels)

Can we increase spatial resolution by interpolation? No: downsampling is an irreversible process.

Image Quantization

Image quantization: discretize continuous pixel values into discrete numbers.

Color resolution / color depth / levels:

- the number of colors or gray levels, or
- the number of bits representing each pixel value.

The number of colors or gray levels Nc is given by Nc = 2^b, where b = number of bits.

(Figure: quantization function mapping light intensity, from darkest to brightest, onto quantization levels 0, 1, 2, …, Nc−2, Nc−1)
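The quantization function above can be sketched as a uniform quantizer, assuming NumPy and intensities normalized to [0, 1):

```python
import numpy as np

def quantize(intensity, b):
    """Uniformly quantize intensities in [0.0, 1.0) into Nc = 2**b levels,
    returning the integer quantization level 0 .. Nc-1.
    A minimal sketch of the slide's quantization function."""
    nc = 2 ** b                                    # Nc = 2^b gray levels
    levels = np.floor(np.asarray(intensity) * nc).astype(int)
    return np.clip(levels, 0, nc - 1)              # clamp the brightest input

# b = 8 bits gives Nc = 256 levels; darkest maps to 0, brightest to Nc-1.
assert list(quantize([0.0, 0.5, 0.999], 8)) == [0, 128, 255]
```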

Intensity Level Resolution

Intensity level resolution refers to the number of intensity levels used to represent the image. The more intensity levels used, the finer the level of detail discernable in an image. Intensity level resolution is usually given in terms of the number of bits used to store each intensity level.

Number of Bits | Number of Intensity Levels | Examples
1  | 2      | 0, 1
2  | 4      | 00, 01, 10, 11
4  | 16     | 0000, 0101, 1111
8  | 256    | 00110011, 01010101
16 | 65,536 | 1010101010101010

Effect of Quantization Levels

(Figures: 256, 128, 64 and 32 levels)

Effect of Quantization Levels (cont.)

(Figures: 16, 8, 4 and 2 levels)

In this image it is easy to see false contouring.

Zooming and shrinking

Common image file formats

PGM (Portable Gray Map)

Bit Map File

PNG (Portable Network Graphics)

GIF (Graphic Interchange Format)

JPEG (Joint Photographic Experts Group)

TIFF (Tagged Image File Format)

FITS (Flexible Image Transport System)

Basic Relationship of Pixels

Conventional indexing method: the origin (0,0) is at the top-left; x grows to the right and y grows downward.

(Figure: pixel (x,y) and its neighbors (x±1, y), (x, y±1), (x±1, y±1))

Neighbors of a Pixel

4-neighbors of p = (x,y):

N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }

The neighborhood relation is used to tell which pixels are adjacent; it is useful for analyzing regions.

Note: q ∈ N4(p) implies p ∈ N4(q).

The 4-neighborhood relation considers only vertical and horizontal neighbors.

Neighbors of a Pixel (cont.)

Diagonal neighbors of p:

ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }

The diagonal-neighborhood relation considers only diagonal neighbor pixels.

Neighbors of a Pixel (cont.)

8-neighbors of p:

N8(p) = N4(p) ∪ ND(p) = { (x-1,y-1), (x,y-1), (x+1,y-1), (x-1,y), (x+1,y), (x-1,y+1), (x,y+1), (x+1,y+1) }

The 8-neighborhood relation considers all neighbor pixels.
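The three neighborhoods can be written down directly from their definitions; a minimal Python sketch (coordinates only, ignoring image borders):

```python
def n4(p):
    """4-neighbors of pixel p = (x, y): vertical and horizontal."""
    x, y = p
    return {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}

def nd(p):
    """Diagonal neighbors of p."""
    x, y = p
    return {(x - 1, y - 1), (x + 1, y - 1), (x - 1, y + 1), (x + 1, y + 1)}

def n8(p):
    """8-neighbors: N8(p) = N4(p) union ND(p)."""
    return n4(p) | nd(p)

# The relation is symmetric: q in N4(p) implies p in N4(q).
p, q = (2, 2), (2, 3)
assert q in n4(p) and p in n4(q)
assert len(n8(p)) == 8
```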

Connectivity

Connectivity is an important concept for finding the region property of an image, or the property of a particular region within the image. It is used for:

Establishing object boundaries

Defining image components/regions, etc.

For p and q from the same class of values V:

4-connectivity: p and q are 4-connected if p, q ∈ V and q ∈ N4(p)

8-connectivity: p and q are 8-connected if p, q ∈ V and q ∈ N8(p)

mixed connectivity (m-connectivity): p and q are m-connected if q ∈ N4(p), or q ∈ ND(p) and N4(p) ∩ N4(q) = ∅

That is, either q is a 4-neighbor of p, or q is a diagonal neighbor of p and at the same time N4(p) ∩ N4(q) = ∅. Here N4(p) ∩ N4(q) denotes the set of points that are 4-neighbors of both p and q. If q is a diagonal neighbor of p and there is a common set of points (in V) that are 4-neighbors of both p and q, then m-connectivity does not hold.

Example: V = {1}, for the arrangement

0 1 1
0 1 0
0 0 1

(Figures: the adjacencies under 4-connectivity, 8-connectivity and m-connectivity)

In the case of m-connectivity, two points are m-connected if one is a 4-neighbor of the other, or one is a diagonal neighbor of the other and at the same time they don’t have any common 4-neighbor.

Path (cont.)

(Figures: paths between p and q)

An 8-path from p to q results in some ambiguity; an m-path from p to q resolves this ambiguity.

Distance

For pixels p, q and z with coordinates (x,y), (s,t) and (u,v), D is a distance function or metric if

D(p,q) ≥ 0  (D(p,q) = 0 if and only if p = q)

D(p,q) = D(q,p)

D(p,z) ≤ D(p,q) + D(q,z)

Example: Euclidean distance

De(p,q) = [(x − s)² + (y − t)²]^(1/2)

Distance (cont.)

The D4-distance (city-block distance) is defined as

D4(p,q) = |x − s| + |y − t|

The pixels with D4 ≤ 2 from the center pixel (0) form the diamond:

    2
  2 1 2
2 1 0 1 2
  2 1 2
    2

Pixels with D4(p) = 1 are the 4-neighbors of p.

Distance (cont.)

The D8-distance (chessboard distance) is defined as

D8(p,q) = max(|x − s|, |y − t|)

The pixels with D8 ≤ 2 from the center pixel (0) form the square:

2 2 2 2 2
2 1 1 1 2
2 1 0 1 2
2 1 1 1 2
2 2 2 2 2

Pixels with D8(p) = 1 are the 8-neighbors of p.
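The three metrics above translate directly into code; a small sketch using only the standard library:

```python
import math

def d_euclidean(p, q):
    """De(p,q) = [(x-s)^2 + (y-t)^2]^(1/2)"""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d4(p, q):
    """City-block distance: |x-s| + |y-t|"""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance: max(|x-s|, |y-t|)"""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p = (0, 0)
# Pixels at D4 = 1 are exactly the 4-neighbors of p;
# pixels at D8 = 1 are exactly the 8-neighbors (diagonals included).
assert d4(p, (0, 1)) == 1 and d4(p, (1, 1)) == 2
assert d8(p, (1, 1)) == 1
assert d_euclidean(p, (3, 4)) == 5.0
```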

Basic operations on images

Arithmetic operations:
- Addition: brightening an image
- Subtraction: detecting missing components; decreasing the brightness of the image
- Multiplication: masking the image to obtain a region of interest
- Division

Logical operations:
- AND / OR: isolating the region of interest from the rest of the image
- NOT: negative of an image
- XOR: detecting change between images

Geometric operations:
- Translation
- Rotation
- Scaling

Arithmetic and Logic Operations

(Figures: basic arithmetic and logic operations on images a and b — NOT(a), a AND b, a + b)

Image Subtraction

(a) Original fractal image. (b) Result of setting the four lower-order bit planes to zero. (c) Difference between (a) and (b). (d) Histogram-equalized difference image.

© 2002 R. C. Gonzalez & R. E. Woods
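The bit-plane experiment above can be reproduced on any 8-bit image; a sketch assuming NumPy, with a random array standing in for the fractal image:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)  # stand-in image

# (b) Zero the four lower-order bit planes: keep only bits 4..7.
low_zeroed = img & 0b11110000

# (c) Difference between (a) and (b): exactly the discarded low bits.
diff = img.astype(int) - low_zeroed.astype(int)

assert np.array_equal(diff, img & 0b00001111)
assert diff.max() <= 15        # four bit planes hold values 0..15
```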

Colour Fundamentals

In 1666 Sir Isaac Newton discovered that when a beam of sunlight passes through a glass prism, the emerging beam is split into a spectrum of colours.

Color Spectrum

The band of visible light is relatively narrow in the band of frequencies of the electromagnetic spectrum.

Perception (cont.)

Primary Colors

The cone cells in the human eye can be divided into three categories, corresponding roughly to red, green and blue (Figure 6.3). Due to these characteristics of the human eye, colors are seen as variable combinations of the primary colors red (700 nm), green (546.1 nm), and blue (435.8 nm), standardized in 1931. This standardization does not mean these three primary colors can generate all spectrum colors.

Secondary Colors

The primary colors of light can be added to produce the secondary colors of light: cyan, magenta and yellow. The primary colors of pigments are cyan, magenta and yellow, while the secondary colors are red, green and blue.

More Fundamentals

The characteristics generally used to distinguish one color from another are hue, saturation, and brightness.

Hue: associated with color as perceived by an observer.

Saturation: relative purity or the amount of white light mixed with a hue.

Brightness: intensity of light.

Hue and saturation taken together are called chromaticity; therefore, a color can be characterized by its chromaticity and brightness.

CIE Chromaticity Diagram (cont…)

Green: 62% green, 25% red and 13% blue

Red: 32% green, 67% red and 1% blue

Colour Models

From the previous discussion it should be obvious that there are different ways to model colour. We will consider two very popular models used in colour image processing:

RGB (Red Green Blue)

HSI (Hue Saturation Intensity)

Converting From RGB To HSI

Given a colour as R, G and B, its H, S and I values are calculated as follows:

H = θ         if B ≤ G
H = 360° − θ  if B > G

where θ = cos⁻¹ { ½[(R − G) + (R − B)] / [(R − G)² + (R − B)(G − B)]^(1/2) }

S = 1 − 3·min(R, G, B) / (R + G + B)

I = (R + G + B) / 3

Converting From HSI To RGB

Given a colour as H, S and I, its R, G and B values are calculated as follows:

RG sector (0° ≤ H < 120°):
B = I(1 − S)
R = I[1 + S·cos H / cos(60° − H)]
G = 3I − (R + B)

GB sector (120° ≤ H < 240°): subtract 120° from H, then
R = I(1 − S)
G = I[1 + S·cos H / cos(60° − H)]
B = 3I − (R + G)

BR sector (240° ≤ H ≤ 360°): subtract 240° from H, then
G = I(1 − S)
B = I[1 + S·cos H / cos(60° − H)]
R = 3I − (G + B)

RGB -> HSI -> RGB

(Figures: an RGB image and its Hue, Saturation and Intensity components, and the image recovered from them)

Questions

1. Explain the components of an image processing system with a neat diagram. [8]

2. Define MTF. Explain it for human vision. [8]

3. Explain with neat diagrams the various mechanisms for image acquisition. [8]

4. Explain the following in the context of human vision: [8]

   1. Luminance & Brightness
   2. MTF

5. With the help of a neat diagram, explain the various steps in image processing. [8]

6. Explain the concept of image sampling and quantization with suitable sketches. [8]

7. Explain the following with respect to digital images:

   1. Spatial and gray-level resolution
   2. Profile and standard deviation

Unit II

Unit II difference

Unit II: Image Enhancement and Restoration (6L)(old)

Spatial domain enhancement: Point operations-Log transformation, Power-law

transformation, Piecewise linear transformations, Histogram equalization. Filtering

operations- Image smoothing, Image sharpening. Frequency domain enhancement:

2D DFT, Smoothing and Sharpening in frequency domain, Homomorphic filtering.

Restoration: Noise models, Restoration using Inverse filtering and Wiener filtering

Unit II: Image Enhancement and Restoration (8 Hrs)(new)

Point operations: Log transformation, Power-law transformation, Piecewise linear transformation. Image histogram, histogram equalization. Mask processing of images, filtering operations: image smoothing, image sharpening. Frequency-domain image enhancement: 2D DFT, smoothing and sharpening in the frequency domain. Pseudo coloring.

Image Restoration: Noise models, restoration using Inverse filtering and Wiener filtering.

Syllabus Mapping with Book

Sr. No. | Contents | Mapping
1 | Point operations: Log transformation, Power-law transformation, Piecewise linear transformation | T1 – Chapter 3 (104–119)
2 | Image histogram, histogram equalization | T1 – Chapter 3 (122–144); R1 pg. 241
3 | Mask processing of images, filtering operations: image smoothing, image sharpening; frequency-domain image enhancement: 2D DFT, smoothing and sharpening in the frequency domain | T1 – Chapter 3 (144–167), Chapter 4 (220–242), Chapter 4 (255–288)
4 | Pseudo coloring | R1 – Chapter 7
5 | Image Restoration: noise models, restoration using Inverse filtering and Wiener filtering | T1 – Chapter 5 (311–356)

T1: Rafael C. Gonzalez and Richard E. Woods, “Digital Image Processing”, Third Edition, Pearson Education

R1: A. K. Jain, Fundamentals of Digital Image Processing, Prentice Hall of India, 1989.


What Is Image Enhancement?

Image enhancement is the process of making images more useful. The reasons for doing this include:

– Highlighting interesting detail in images

– Removing noise from images

– Making images more visually appealing

Image Enhancement Examples

Image Enhancement Examples

Spatial & Frequency Domains

There are two broad categories of image enhancement techniques:

– Spatial domain techniques: direct manipulation of image pixels

– Frequency domain techniques: manipulation of the Fourier transform or wavelet transform of an image

Contents

– What is point processing?

– Negative images

– Thresholding

– Logarithmic transformation

– Power law transforms

– Grey level slicing

– Bit plane slicing

Basic Spatial Domain Image Enhancement

Most spatial domain enhancement operations can be reduced to the form

g(x, y) = T[f(x, y)]

where f(x, y) is the input image, g(x, y) is the processed image and T is some operator defined over some neighbourhood of (x, y). (The origin of the image is at the top-left.)

Point Processing

The simplest spatial domain operations occur when the neighbourhood is simply the pixel itself. In this case T is referred to as a grey-level transformation function or a point processing operation. Point processing operations take the form

s = T(r)

where s refers to the processed image pixel value and r refers to the original image pixel value.

Basic Grey Level Transformations

There are many different kinds of grey-level transformations:

1) Linear: negative / identity

2) Logarithmic: log / inverse log

3) Power law: nth power / nth root

Point Processing Example: Negative Images

(Figures: original image and enhanced image)

The negative of an image is obtained with the point operation

s = intensity_max − r

Piecewise Linear Transformation Functions

Case 1: Contrast Stretching

Case 2: Gray-level Slicing

Case 3: Bit-plane Slicing

Image Histograms

The histogram of an image shows us the distribution of grey levels in the image (frequency of each grey level). Massively useful in image processing, especially in segmentation.

Histogram Examples

Histogram Equalisation

Spreading out the frequencies in an image (or equalising the image) is a simple way to improve dark or washed-out images. The formula for histogram equalisation is

s_k = T(r_k) = Σ_{j=1}^{k} p_r(r_j) = Σ_{j=1}^{k} n_j / n

where

– r_k: input intensity

– s_k: processed intensity

– k: the intensity range (e.g. 0.0 – 1.0)

– n_j: the frequency of intensity j

– n: the sum of all frequencies

The mode is the value that occurs most frequently in a distribution and is usually the highest point on the curve (histogram). It is common, however, to encounter more than one mode in a remote sensing dataset.

The median is the value midway in the frequency distribution. One-half of the area below the distribution curve is to the right of the median, and one-half is to the left

The mean is the arithmetic average and is defined as the sum of all brightness value observations divided by the number of observations.

Example

4×4 image, gray scale = [0, 9]:

2 3 3 2
4 2 4 3
3 2 3 5
2 4 2 4

Histogram and equalisation table (n = 16):

Gray level j      | 0 | 1 | 2    | 3     | 4     | 5     | 6 | 7 | 8 | 9
No. of pixels n_j | 0 | 0 | 6    | 5     | 4     | 1     | 0 | 0 | 0 | 0
Running sum Σ n_j | 0 | 0 | 6    | 11    | 15    | 16    | 16 | 16 | 16 | 16
s = Σ n_j / n     | 0 | 0 | 6/16 | 11/16 | 15/16 | 16/16 | 16/16 | 16/16 | 16/16 | 16/16
s × 9             | 0 | 0 | 3.3 → 3 | 6.1 → 6 | 8.4 → 8 | 9 | 9 | 9 | 9 | 9

Example (cont.)

Output image after histogram equalization (gray scale = [0, 9]):

3 6 6 3
8 3 8 6
6 3 6 9
3 8 3 8

(Figure: histogram of the output image)
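The worked example above can be reproduced in a few lines, assuming NumPy; the mapping 2→3, 3→6, 4→8, 5→9 falls out of the cumulative histogram:

```python
import numpy as np

img = np.array([[2, 3, 3, 2],
                [4, 2, 4, 3],
                [3, 2, 3, 5],
                [2, 4, 2, 4]])          # the 4x4 example, gray scale [0, 9]

L = 10                                   # levels 0..9
hist = np.bincount(img.ravel(), minlength=L)        # n_j per gray level
cdf = np.cumsum(hist)                    # running sum of n_j
s = np.round(cdf / img.size * (L - 1)).astype(int)  # s_k scaled to [0, 9]

equalized = s[img]                       # apply the mapping to every pixel
assert list(s[2:6]) == [3, 6, 8, 9]      # 2->3, 3->6, 4->8, 5->9 as above
```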

Simple Neighbourhood Operations Example

Original image (6-column excerpt):

164 170 175 162 173 151
123 127 128 119 115 130
140 145 148 153 167 172
133 154 183 192 194 191
194 199 207 210 198 195

(Figure: enhanced image)

The Spatial Filtering Process

A 3×3 filter with weights

r s t
u v w
x y z

is placed over a 3×3 neighbourhood of image pixels

a b c
d e f
g h i

with centre pixel e, and the processed value is

e_processed = v*e + r*a + s*b + t*c + u*d + w*f + x*g + y*h + z*i

The above is repeated for every pixel in the original image to generate the filtered image.

Smoothing Spatial Filters

One of the simplest spatial filtering operations we can perform is a smoothing operation:

– Simply average all of the pixels in a neighbourhood around a central value

– Especially useful in removing noise from images

– Also useful for highlighting gross detail

Simple averaging filter:

1/9 1/9 1/9
1/9 1/9 1/9
1/9 1/9 1/9

Smoothing Spatial Filtering

Applying the 3×3 smoothing filter to the neighbourhood

104 100 108
 99 106  98
 95  90  85

gives

e = 1/9*106 + 1/9*104 + 1/9*100 + 1/9*108 + 1/9*99 + 1/9*98 + 1/9*95 + 1/9*90 + 1/9*85 = 98.3333

The above is repeated for every pixel in the original image to generate the smoothed image.
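The single-pixel computation above checks out numerically; a NumPy sketch of one filter application:

```python
import numpy as np

neighbourhood = np.array([[104, 100, 108],
                          [ 99, 106,  98],
                          [ 95,  90,  85]], dtype=float)

kernel = np.full((3, 3), 1/9)            # simple 3x3 averaging filter

# Filter response at the centre pixel: multiply element-wise and sum.
e = float(np.sum(neighbourhood * kernel))
assert abs(e - 98.3333) < 1e-3           # matches the slide's 98.3333
```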

Weighted Smoothing Filters

More effective smoothing filters can be generated by allowing different pixels in the neighbourhood different weights in the averaging function:

– Pixels closer to the central pixel are more important

– Often referred to as weighted averaging

Weighted averaging filter:

1/16 2/16 1/16
2/16 4/16 2/16
1/16 2/16 1/16

Sharpening Spatial Filters

Previously we have looked at smoothing filters, which remove fine detail. Sharpening spatial filters seek to highlight fine detail:

– Remove blurring from images

– Highlight edges

Sharpening filters are based on spatial differentiation.

Spatial Differentiation

Differentiation measures the rate of change of a function. Let’s consider a simple 1-dimensional example.

(Figure: a 1-D intensity profile with points A and B)

1st Derivative

It’s just the difference between subsequent values and measures the rate of change of the function. The formula for the 1st derivative of a function is as follows:

∂f/∂x = f(x + 1) − f(x)

Requirements of the first-order derivative:

– Must be zero in flat segments

– Must be nonzero at the onset of a gray-level step or ramp

– Must be nonzero along ramps

2nd Derivative

Simply takes into account the values both before and after the current value. The formula for the 2nd derivative of a function is as follows:

∂²f/∂x² = f(x + 1) + f(x − 1) − 2f(x)

Requirements of the second-order derivative:

– Must be zero in flat areas

– Must be nonzero at the onset and at the end of a gray-level step or ramp

– Must be zero along ramps

Using Second Derivatives For Image Enhancement

The 2nd derivative is more useful for image enhancement than the 1st derivative:

– Stronger response to fine detail

– Simpler implementation

The first sharpening filter we will look at is the Laplacian:

– Isotropic (rotation invariant)

– One of the simplest sharpening filters

– We will look at a digital implementation

Variants On The Simple Laplacian

There are lots of slightly different versions of the Laplacian that can be used:

Simple Laplacian:

 0  1  0
 1 -4  1
 0  1  0

Variant of Laplacian (including diagonals):

 1  1  1
 1 -8  1
 1  1  1

Composite sharpening variant:

-1 -1 -1
-1  9 -1
-1 -1 -1

Sobel Operators

To filter an image, it is filtered using both operators and the results are combined:

-1 -2 -1        -1  0  1
 0  0  0        -2  0  2
 1  2  1        -1  0  1
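A quick check of how the two operators divide the work, assuming NumPy; each kernel responds to edges in one orientation only:

```python
import numpy as np

sobel_y = np.array([[-1, -2, -1],
                    [ 0,  0,  0],
                    [ 1,  2,  1]], dtype=float)  # responds to horizontal edges
sobel_x = sobel_y.T                               # responds to vertical edges

# A vertical step edge: dark left half, bright right half.
img = np.zeros((5, 5)); img[:, 2:] = 100.0

def response(image, k, y, x):
    """Filter response of a 3x3 kernel k centred at (y, x)."""
    return float(np.sum(image[y-1:y+2, x-1:x+2] * k))

gx = response(img, sobel_x, 2, 2)
gy = response(img, sobel_y, 2, 2)
grad = abs(gx) + abs(gy)                 # combine both operators' results
assert gx != 0 and gy == 0               # vertical edge: only x-gradient fires
```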

The Two-Dimensional DFT and Its Inverse

(a) f(x,y)  (b) F(u,y)  (c) F(u,v)

The 2D DFT F(u,v) can be obtained by:

1. taking the 1D DFT of every row of the image f(x,y), giving F(u,y),

2. taking the 1D DFT of every column of F(u,y).
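This row-then-column separability can be verified directly with NumPy's FFT routines:

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.standard_normal((8, 8))          # a small test image

# Step 1: 1D DFT of every row  -> F(u, y)
rows = np.fft.fft(f, axis=1)
# Step 2: 1D DFT of every column of the result -> F(u, v)
F_sep = np.fft.fft(rows, axis=0)

# The separable computation equals the direct 2D DFT.
assert np.allclose(F_sep, np.fft.fft2(f))
```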

Basics of Filtering in the Frequency Domain

Some Basic Filters and Their Functions

Lowpass filter

Highpass filter

Ideal Lowpass Filters (ILPFs)

The simplest lowpass filter is a filter that “cuts off” all high-frequency components of the Fourier transform that are at a distance greater than a specified distance D0 from the origin of the transform. The transfer function of an ideal lowpass filter is

H(u,v) = 1 if D(u,v) ≤ D0
H(u,v) = 0 if D(u,v) > D0

where D(u,v) is the distance from point (u,v) to the center of the frequency rectangle:

D(u,v) = [(u − M/2)² + (v − N/2)²]^(1/2)


Butterworth Lowpass Filters (BLPFs) with order n:

H(u,v) = 1 / [1 + (D(u,v)/D0)^(2n)]

(Figures: n = 2; D0 = 5, 15, 30, 80 and 230)

Gaussian Lowpass Filters (GLPFs)

H(u,v) = e^(−D²(u,v) / 2D0²)

(Figures: D0 = 5, 15, 30, 80 and 230)

Additional Examples of Lowpass Filtering

Sharpening Frequency Domain Filters

A highpass filter is obtained from the corresponding lowpass filter:

Hhp(u,v) = 1 − Hlp(u,v)

Ideal highpass filter:

H(u,v) = 0 if D(u,v) ≤ D0
H(u,v) = 1 if D(u,v) > D0

Butterworth highpass filter:

H(u,v) = 1 / [1 + (D0/D(u,v))^(2n)]

Gaussian highpass filter:

H(u,v) = 1 − e^(−D²(u,v) / 2D0²)
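The transfer functions above are simple arrays over the frequency rectangle; a NumPy sketch building all three lowpass filters and their highpass complements, with the grid size and D0 chosen arbitrarily for illustration:

```python
import numpy as np

def dist(M, N):
    """D(u,v): distance of each frequency sample from the centre (M/2, N/2)."""
    u, v = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
    return np.sqrt((u - M / 2) ** 2 + (v - N / 2) ** 2)

def ideal_lp(D, D0):
    return (D <= D0).astype(float)

def butterworth_lp(D, D0, n):
    return 1.0 / (1.0 + (D / D0) ** (2 * n))

def gaussian_lp(D, D0):
    return np.exp(-D ** 2 / (2.0 * D0 ** 2))

D = dist(64, 64)
for lp in (ideal_lp(D, 10), butterworth_lp(D, 10, 2), gaussian_lp(D, 10)):
    hp = 1.0 - lp                        # Hhp(u,v) = 1 - Hlp(u,v)
    assert lp[32, 32] > 0.99             # lowpass passes the centre (D = 0)
    assert hp[0, 0] > 0.99               # highpass keeps the far corner
```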

Highpass Filters Spatial Representations

(Figures: spatial-domain representations of the ideal, Butterworth and Gaussian highpass filters defined on the previous slide)

Image restoration

A model of the image degradation/restoration process.

Noise sources include:

– Electronic noise / poor illumination

– Sensor noise in range imaging (e.g. laser imaging)

– Quick transients (faulty switching)

Periodic noise reduction by:

– Spatial filters

– Frequency domain filters

Approach: starting from f(x,y), build a degradation model, formulate restoration algorithms, analyze using algebraic techniques, and implement using Fourier transforms to obtain an estimate f̂(x,y):

g = h * f + n              (spatial domain: convolution plus noise)

g = Hf + n                 (matrix form)

W⁻¹g = DW⁻¹f + W⁻¹n        (H diagonalized by the transform W)

f̂ = H⁻¹g                   (noise ignored)

F̂(u,v) = G(u,v) / H(u,v)   (inverse filtering in the frequency domain)
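Inverse filtering can be sketched end-to-end with NumPy in the noise-free case; the image and blur kernel here are stand-ins, and tiny |H| values are guarded because in practice noise explodes wherever H(u,v) is near zero:

```python
import numpy as np

rng = np.random.default_rng(2)
f = rng.standard_normal((16, 16))        # original image (stand-in)
h = np.zeros((16, 16)); h[0, :3] = 1/3   # a small horizontal-blur PSF

# Degradation in the frequency domain: G = H F  (noise-free case).
F = np.fft.fft2(f)
H = np.fft.fft2(h)
G = H * F

# Inverse filtering: F_hat(u,v) = G(u,v) / H(u,v); guard tiny |H| to
# avoid dividing by near-zero terms (where noise would dominate).
eps = 1e-8
ok = np.abs(H) > eps
F_hat = np.where(ok, G / np.where(ok, H, 1), 0)
f_hat = np.real(np.fft.ifft2(F_hat))

assert np.allclose(f_hat, f, atol=1e-6)  # exact recovery without noise
```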

Unit 2 Questions

