PR-lecture 6-7


Page 1: PR-lecture 6-7

They said: "Glory be to You; we have no knowledge except what You have taught us. Indeed, You are the All-Knowing, the All-Wise." Allah Almighty has spoken the truth.

Lecture 6, lecture 7 - Professor Sayed Fadel Bahgat - Pattern Recognition

Page 2: PR-lecture 6-7

Prayers and peace be upon the noblest of Allah's creation, our Prophet and master Muhammad, peace and blessings be upon him.

Glory and praise be to You, O Allah; I bear witness that there is no god but You; I seek Your forgiveness and repent to You.

Page 3: PR-lecture 6-7


Prof. Dr. Sayed Fadel Bahgat

[email protected]

Page 4: PR-lecture 6-7

Chapter two


Page 5: PR-lecture 6-7

• Homogeneity (HOM), also called "Inverse Difference Moment"


• Homogeneity weights values by the inverse of the Contrast weight, with weights decreasing exponentially away from the diagonal:

Page 6: PR-lecture 6-7

Homogeneity equation: HOM = Σi,j [ P(i,j) / (1 + (i-j)²) ]


Page 7: PR-lecture 6-7

• Exercise: Calculate the homogeneity value for the horizontal GLCM and compare it with the Dissimilarity value.

• Homogeneity calculation: Homogeneity weights X horizontal GLCM = multiplication results

Page 8: PR-lecture 6-7

Homogeneity weights, (1 + (i-j)²)⁻¹ :

1     0.5   0.2   0.1
0.5   1     0.5   0.2
0.2   0.5   1     0.5
0.1   0.2   0.5   1

These weights are multiplied element-wise by the GLCM, P(i,j).


Page 10: PR-lecture 6-7

Normalized symmetrical horizontal GLCM matrix, P(i,j):

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083


Page 12: PR-lecture 6-7

Homogeneity weights:

1     0.5   0.2   0.1
0.5   1     0.5   0.2
0.2   0.5   1     0.5
0.1   0.2   0.5   1

multiplied element-wise, (1 + (i-j)²)⁻¹ × P(i,j), by the normalized horizontal GLCM:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

gives:

0.166  0.042  0.008  0
0.042  0.166  0      0
0.008  0      0.25   0.021
0      0      0.021  0.083

Sum of the multiplication results matrix = 0.807

Page 13: PR-lecture 6-7

• Sum of the multiplication results matrix = 0.807. In non-matrix form:

• 0.166(1) + 0.083(0.5) + 0.042(0.2) + 0(0.1)
  + 0.083(0.5) + 0.166(1) + 0(0.5) + 0(0.2)
  + 0.042(0.2) + 0(0.5) + 0.250(1) + 0.042(0.5)
  + 0(0.1) + 0(0.2) + 0.042(0.5) + 0.083(1)

• = 0.166 + 0.042 + 0.008 + 0
  + 0.042 + 0.166 + 0 + 0
  + 0.008 + 0 + 0.250 + 0.021
  + 0 + 0 + 0.021 + 0.083

• = 0.807
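As an illustration (not part of the original slides), a minimal Python sketch of this homogeneity calculation, assuming NumPy is available; the array name glcm_h is ours:

import numpy as np

# Normalized symmetrical horizontal GLCM from the slides
glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])

i, j = np.indices(glcm_h.shape)              # reference (i) and neighbor (j) grey levels
weights = 1.0 / (1.0 + (i - j) ** 2)         # homogeneity weights, (1 + (i-j)^2)^-1
print(round(float(np.sum(weights * glcm_h)), 3))   # ~0.807, as on this slide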

Page 14: PR-lecture 6-7

Simple Analysis of Texture


Page 15: PR-lecture 6-7

• Homogeneity is the most commonly used measure that increases with less contrast in the window.

• However, it would be easy to use the above model to construct a first-degree "similarity" measure. Write the equation and perform the calculation for "similarity" using the horizontal GLCM.

Page 16: PR-lecture 6-7

• Similarity equation: Similarity = Σi,j [ P(i,j) / (1 + |i-j|) ]

Page 17: PR-lecture 6-7

• Exercise: Calculate the similarity value for the horizontal GLCM.

• Similarity calculation: Similarity weights X horizontal GLCM = multiplication results


Page 19: PR-lecture 6-7

Similarity weights, (1 + |i-j|)⁻¹ :

1      0.5    0.333  0.25
0.5    1      0.5    0.333
0.333  0.5    1      0.5
0.25   0.333  0.5    1

These weights are multiplied element-wise by the GLCM, P(i,j).

Page 20: PR-lecture 6-7

Normalized symmetrical horizontal GLCM matrix, P(i,j):

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083


Page 22: PR-lecture 6-7

Similarity weights:

1      0.5    0.333  0.25
0.5    1      0.5    0.333
0.333  0.5    1      0.5
0.25   0.333  0.5    1

multiplied element-wise, (1 + |i-j|)⁻¹ × P(i,j), by the normalized horizontal GLCM:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

gives:

0.166  0.042  0.014  0
0.042  0.166  0      0
0.014  0      0.25   0.021
0      0      0.021  0.083

Sum of the multiplication results matrix = 0.819

Page 23: PR-lecture 6-7

Sum of the multiplication results matrix = 0.819. In non-matrix form:

0.166(1) + 0.083(0.5) + 0.042(0.333) + 0(0.25)
+ 0.083(0.5) + 0.166(1) + 0(0.5) + 0(0.333)
+ 0.042(0.333) + 0(0.5) + 0.250(1) + 0.042(0.5)
+ 0(0.25) + 0(0.333) + 0.042(0.5) + 0.083(1)

= 0.166 + 0.042 + 0.014 + 0
+ 0.042 + 0.166 + 0 + 0
+ 0.014 + 0 + 0.250 + 0.021
+ 0 + 0 + 0.021 + 0.083

= 0.819
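A matching sketch for the first-degree "similarity" measure, again an illustration assuming NumPy and the same glcm_h array:

import numpy as np

glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])

i, j = np.indices(glcm_h.shape)
weights = 1.0 / (1.0 + np.abs(i - j))        # similarity weights, (1 + |i-j|)^-1
print(round(float(np.sum(weights * glcm_h)), 3))   # ~0.82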

Page 24: PR-lecture 6-7

Simple Analysis of Texture


Page 25: PR-lecture 6-7

Simple Analysis of Texture


Page 26: PR-lecture 6-7

• Angular Second Moment (ASM) and Energy (also called Uniformity)

• ASM and Energy use each Pij as a weight for itself. High values of ASM or Energy occur when the window is very orderly.


Page 27: PR-lecture 6-7

ASM equation: ASM = Σi,j P(i,j)²

The square root of the ASM is sometimes used as a texture measure, and is called Energy.


Page 28: PR-lecture 6-7

Normalized symmetrical horizontal GLCM matrix, P(i,j):

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

Page 29: PR-lecture 6-7

• Exercise: Perform the ASM calculation for the horizontal GLCM.

• Answer: matrix of P(i,j)²:

0.028  0.007  0.002  0
0.007  0.028  0      0
0.002  0      0.0625 0.002
0      0      0.002  0.007

summed = 0.145
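A short sketch of the ASM and Energy calculation (illustrative only; NumPy assumed):

import numpy as np

glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])

asm = float(np.sum(glcm_h ** 2))   # each P(i,j) weighted by itself
energy = asm ** 0.5                # Energy is the square root of ASM
print(round(asm, 3), round(energy, 3))   # ~0.145 and ~0.381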

Page 30: PR-lecture 6-7

Simple Analysis of Texture


Page 31: PR-lecture 6-7

Simple Analysis of Texture


Page 32: PR-lecture 6-7

Simple Analysis of Texture


Page 33: PR-lecture 6-7

Entropy equation: Entropy = - Σi,j P(i,j) ln P(i,j)   (terms with P(i,j) = 0 are taken as 0)

Page 34: PR-lecture 6-7

• Entropy = ln(P(i,j)) X horizontal GLCM X (-1)
• = sum of the multiplication results

Page 35: PR-lecture 6-7

Normalized symmetrical horizontal GLCM matrix, P(i,j):

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

Page 36: PR-lecture 6-7

ln(P(i,j)):

-1.7957  -2.4889  -3.1700   0
-2.4889  -1.7957   0        0
-3.1700   0       -1.3863  -3.1700
 0        0       -3.1700  -2.4889

Page 37: PR-lecture 6-7

P(i,j):

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

-ln(P(i,j)):

1.7957  2.4889  3.1700  0
2.4889  1.7957  0       0
3.1700  0       1.3863  3.1700
0       0       3.1700  2.4889

P(i,j) X (-ln(P(i,j))) =

0.2980  0.2065  0.1331  0
0.2065  0.2980  0       0
0.1331  0       0.3465  0.1331
0       0       0.1331  0.2065

Page 38: PR-lecture 6-7

Entropy = sum of the product matrix:

0.2980  0.2065  0.1331  0
0.2065  0.2980  0       0
0.1331  0       0.3465  0.1331
0       0       0.1331  0.2065

Sum of the multiplication results matrix = 2.0951
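A sketch of the entropy calculation (illustrative; NumPy assumed). Zero entries are skipped, matching the convention that 0 * ln(0) contributes nothing:

import numpy as np

glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])

nonzero = glcm_h[glcm_h > 0]                        # ignore the P(i,j) = 0 terms
entropy = float(-np.sum(nonzero * np.log(nonzero)))
print(round(entropy, 4))                            # ~2.095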

Page 39: PR-lecture 6-7

GLCM Mean

• GLCM Mean equations: µi = Σi,j i · P(i,j)   and   µj = Σi,j j · P(i,j)


Page 40: PR-lecture 6-7

• The left-hand equation calculates the mean based on the reference pixels, µi. It is also possible to calculate the mean using the neighbor pixels, µj, as in the right-hand equation.
• For the symmetrical GLCM, the two values µi and µj are identical.

Page 41: PR-lecture 6-7

• Exercise: For the test image, calculate the mean of the symmetrical horizontal and vertical GLCMs.

Page 42: PR-lecture 6-7

Normalized symmetrical horizontal GLCM matrix, P(i,j):

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

Page 43: PR-lecture 6-7

• For the test image, the mean of the symmetrical horizontal GLCM:

• µi = 0*(0.166 + 0.083 + 0.042 + 0)
     + 1*(0.083 + 0.166 + 0 + 0)
     + 2*(0.042 + 0 + 0.250 + 0.042)
     + 3*(0 + 0 + 0.042 + 0.083)

• = 0.249 + 2(0.334) + 3(0.125)
• = 0.249 + 0.668 + 0.375
• = 1.292 = µj

Page 44: PR-lecture 6-7

Normalized symmetrical vertical GLCM matrix, P(i,j):

0.250  0      0.083  0
0      0.167  0.083  0
0.083  0.083  0.083  0.083
0      0      0.083  0

Page 45: PR-lecture 6-7

• For the test image, the mean of the symmetrical vertical GLCM:

• µi = 0*(0.250 + 0 + 0.083 + 0)
     + 1*(0 + 0.167 + 0.083 + 0)
     + 2*(0.083 + 0.083 + 0.083 + 0.083)
     + 3*(0 + 0 + 0.083 + 0)

• = 1.162 = µj
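A sketch of the GLCM mean for both matrices (illustrative; NumPy assumed). For a symmetrical GLCM the reference-pixel mean µi equals the neighbor-pixel mean µj:

import numpy as np

glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])
glcm_v = np.array([[0.250, 0.000, 0.083, 0.000],
                   [0.000, 0.167, 0.083, 0.000],
                   [0.083, 0.083, 0.083, 0.083],
                   [0.000, 0.000, 0.083, 0.000]])

def glcm_mean(p):
    i, _ = np.indices(p.shape)
    return float(np.sum(i * p))     # mu_i; equal to mu_j for a symmetrical GLCM

print(round(glcm_mean(glcm_h), 3), round(glcm_mean(glcm_v), 3))   # ~1.292 and ~1.16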

Page 46: PR-lecture 6-7

• The mean of the original values in the window (not the GLCM mean) is 1.25.

Test image window:

0  0  1  1
0  0  1  1
0  2  2  2
2  2  3  3

• This clearly demonstrates that the GLCM Mean and the "ordinary" mean are not the same measure.
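A quick illustrative check of the two means on the test window (NumPy assumed; variable names are ours):

import numpy as np

window = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [0, 2, 2, 2],
                   [2, 2, 3, 3]])
glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])

i, _ = np.indices(glcm_h.shape)
print(window.mean())                         # 1.25, the "ordinary" first-order mean
print(round(float(np.sum(i * glcm_h)), 3))   # ~1.292, the GLCM mean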

Page 47: PR-lecture 6-7

• The "ordinary mean would be a first-order "texture" measure,

• it is difficult to see how it could be called texture in any practical sense.

• The first-order standard deviation, however, is commonly used as a texture

Lecture 6, lecture 7 Professor Sayed Fadel Bahgat Pattern Recognition

however, is commonly used as a texture measure.

• remember, all GLCM texture is second-order (concerning the relationship between two pixels).

Page 48: PR-lecture 6-7

• GLCM Variance (GLCM Standard Deviation)
• Exercise: Calculate the Variance texture for both the horizontal and vertical GLCMs of the test image.
• Variance equation: σi² = Σi,j P(i,j) (i - µi)²   (and similarly σj² with j and µj)

Page 49: PR-lecture 6-7

• Standard deviation equation: σi = √(σi²)


Page 50: PR-lecture 6-7

• Properties of Variance
• Variance is a measure of the dispersion of the values around the mean.
• It is similar to entropy. It answers the question "What is the dispersion of the difference between the reference and the neighbor pixels in this window?"

Page 51: PR-lecture 6-7

• Exercise: Calculate the Variance texture for both the horizontal and vertical GLCMs of the test image.

• µi = 1.292

Normalized symmetrical horizontal GLCM:

0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.25   0.042
0      0      0.042  0.083

• Variance (horizontal) =
  0.166(0-1.292)² + 0.083(0-1.292)² + 0.042(0-1.292)² + 0
  + 0.083(1-1.292)² + 0.166(1-1.292)² + 0 + 0
  + 0.042(2-1.292)² + 0 + 0.250(2-1.292)² + 0.042(2-1.292)²
  + 0 + 0 + 0.042(3-1.292)² + 0.083(3-1.292)²

• = 1.039067

Page 52: PR-lecture 6-7

• µj = 1.162

Normalized symmetrical vertical GLCM:

0.250  0      0.083  0
0      0.167  0.083  0
0.083  0.083  0.083  0.083
0      0      0.083  0

• Variance (vertical) =
  0.250(0-1.162)² + 0 + 0.083(0-1.162)² + 0
  + 0 + 0.167(1-1.162)² + 0.083(1-1.162)² + 0
  + 0.083(2-1.162)² + 0.083(2-1.162)² + 0.083(2-1.162)² + 0.083(2-1.162)²
  + 0 + 0 + 0.083(3-1.162)² + 0

• = 0.969705

• Variance calculated on the original image values rather than on the GLCM = 1.030776
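A sketch of the GLCM variance (and standard deviation) for both matrices, illustrative only and assuming NumPy:

import numpy as np

glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])
glcm_v = np.array([[0.250, 0.000, 0.083, 0.000],
                   [0.000, 0.167, 0.083, 0.000],
                   [0.083, 0.083, 0.083, 0.083],
                   [0.000, 0.000, 0.083, 0.000]])

def glcm_variance(p):
    i, _ = np.indices(p.shape)
    mu = np.sum(i * p)                       # GLCM mean (mu_i)
    return float(np.sum(p * (i - mu) ** 2))  # dispersion of i about that mean

for p in (glcm_h, glcm_v):
    var = glcm_variance(p)
    print(round(var, 3), round(var ** 0.5, 3))   # ~1.039 / ~1.019, then ~0.970 / ~0.985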

Page 53: PR-lecture 6-7

• GLCM Correlation
• The Correlation texture measures the linear dependency of grey levels on those of neighboring pixels.
• GLCM Correlation equation: Correlation = Σi,j P(i,j) [ (i - µi)(j - µj) / √(σi² · σj²) ]

Page 54: PR-lecture 6-7

• Exercise: Calculate the Correlation measure for the horizontal GLCM of the test image.
• This is easier if you do the GLCM Mean and Variance exercises first and use their results.

Page 55: PR-lecture 6-7

• Correlation (horizontal GLCM):

• Let A = 1 / [(1.039067)(1.039067)]^0.5

  0.166(0-1.292)(0-1.292)A + 0.083(0-1.292)(1-1.292)A + 0.042(0-1.292)(2-1.292)A + 0
  + 0.083(1-1.292)(0-1.292)A + 0.166(1-1.292)(1-1.292)A + 0 + 0
  + 0.042(2-1.292)(0-1.292)A + 0 + 0.250(2-1.292)(2-1.292)A + 0.042(2-1.292)(3-1.292)A
  + 0 + 0 + 0.042(3-1.292)(2-1.292)A + 0.083(3-1.292)(3-1.292)A

• = 0.7182362
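A sketch of the correlation calculation for the horizontal GLCM, reusing the GLCM mean and variance as above (illustrative; NumPy assumed; for a symmetrical GLCM µi = µj and σi² = σj²):

import numpy as np

glcm_h = np.array([[0.166, 0.083, 0.042, 0.000],
                   [0.083, 0.166, 0.000, 0.000],
                   [0.042, 0.000, 0.250, 0.042],
                   [0.000, 0.000, 0.042, 0.083]])

i, j = np.indices(glcm_h.shape)
mu = np.sum(i * glcm_h)                      # GLCM mean, ~1.292
var = np.sum(glcm_h * (i - mu) ** 2)         # GLCM variance, ~1.039
corr = np.sum(glcm_h * (i - mu) * (j - mu)) / var
print(round(float(corr), 3))                 # ~0.718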

Page 56: PR-lecture 6-7

• Creating a texture image
• The result of a texture calculation is a single number representing the entire window.
• This number is put in the place of the centre pixel of the window; then the window is moved one pixel and the process of calculating a new GLCM and a new texture measure is repeated. In this way an entire image of texture values is built up, as sketched below.
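A sketch of how such a texture image could be built, illustrative only: it assumes a small integer image with grey levels 0..levels-1, uses the horizontal offset, a symmetrical normalized GLCM per window, homogeneity as the texture measure, and simply leaves border pixels at zero (all function names are ours):

import numpy as np

def symmetric_glcm(window, levels, dx=1, dy=0):
    # Count (dx, dy) neighbor pairs, add the transpose to make it symmetrical, then normalize.
    glcm = np.zeros((levels, levels))
    rows, cols = window.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                glcm[window[r, c], window[r2, c2]] += 1
    glcm = glcm + glcm.T
    return glcm / glcm.sum()

def homogeneity(p):
    i, j = np.indices(p.shape)
    return float(np.sum(p / (1.0 + (i - j) ** 2)))

def texture_image(image, levels, win=5):
    # Slide the window; put each window's texture value at its centre pixel.
    out = np.zeros(image.shape, dtype=float)
    half = win // 2
    for r in range(half, image.shape[0] - half):
        for c in range(half, image.shape[1] - half):
            window = image[r - half:r + half + 1, c - half:c + half + 1]
            out[r, c] = homogeneity(symmetric_glcm(window, levels))
    return out

With a 5x5 window, as in the example on the next page, each output pixel then summarizes the 25-pixel neighbourhood centred on it.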

Page 57: PR-lecture 6-7

• Example: For a 5x5 window,


Page 58: PR-lecture 6-7

Peace be upon you, and the mercy and blessings of Allah.