
Ant Colony Optimization for Feature Selection and Classification of Microcalcifications in Digital Mammograms

M. Karnan¹, K. Thangavel², R. Sivakumar³, K. Geetha³

¹Department of Computer Science, Gandhigram Deemed University, Dindigul, INDIA. ²Department of Computer Science, Periyar University, Salem, INDIA.

³Hindusthan College of Engineering and Information Technology, Coimbatore, INDIA. Email: [email protected]

Abstract
The Genetic Algorithm (GA) and the Ant Colony Optimization (ACO) algorithm are proposed for feature selection, and their performance is compared. The Spatial Gray Level Dependence Method (SGLDM) is used for feature extraction. The selected features are fed to a three-layer Backpropagation Network hybrid with Ant Colony Optimization (BPN-ACO) for classification, and Receiver Operating Characteristic (ROC) analysis is performed to evaluate the feature selection methods through their classification results. The proposed algorithms are tested with 114 abnormal images from the Mammography Image Analysis Society (MIAS) database.

Keywords: Feature Extraction, Feature Selection, Genetic Algorithm, Ant Colony Optimization, Backpropagation Network, Receiver Operating Characteristic.

1. Introduction
In many Western countries breast cancer is the most common form of cancer among women. The World Health Organization's International Agency for Research on Cancer estimates that more than 150,000 women worldwide die of breast cancer each year. Breast cancer is among the top three cancers in American women. In the United States, the American Cancer Society estimates that 215,990 new cases of breast carcinoma were diagnosed in 2004. It is the leading cause of death due to cancer in women under the age of 65 [2]. Clear evidence shows that early diagnosis and treatment of breast cancer can significantly increase the chance of survival for patients. One of the most important radiological signs for early detection of breast cancer is the presence and appearance of microcalcifications. These are small regions of elevated intensity against the varying background density of the X-ray mammogram [14].

Thangavel et al. [27] presented a good review of various methods for the detection of microcalcifications. It is of crucial importance to design the classification method in such a way as to obtain a high True-Positive Fraction (TPF) while keeping the False-Positive Fraction (FPF) at its minimum level.

One of the most important steps in the classification task is extracting suitable features capable of distinguishing between classes. Great effort has been spent on extracting appropriate features from microcalcification clusters [3,5,21,22]. One set of features used for microcalcification classification is texture features. Here, the texture features are extracted from the segmented mammogram image using the Spatial Gray Level Dependence Method (SGLDM) [18]. In order to reduce complexity and increase classifier performance, redundant and irrelevant features are removed from the original feature set. In this paper, GA and ACO algorithms are proposed to select the optimal features from the original feature set. Only the selected optimal features are fed to the classifier for classification of microcalcifications. The following section presents an overview of the work.

1.1 Overview of the CAD System
In this paper, the classification is performed in four steps. Initially, the mammogram image is smoothed using a median filter and the pectoral muscle region is removed from the breast region. Next, the Ant Colony Optimization (ACO) algorithm hybrid with the Markov Random Field (MRF) method is used to segment the microcalcifications from the enhanced mammogram image. In the second step, the co-occurrence matrix is generated to extract the texture features from the segmented image. The Haralick features are extracted from the co-occurrence matrix [12], giving 14 features for each image.


These 14 features are grouped into four categories according to their characteristics. In the third step, the Genetic Algorithm (GA) and the Ant Colony Optimization (ACO) algorithm are proposed to select the optimal feature from each group of the feature set. The optimal features selected from the original feature set are considered for classification. In the fourth step, the selected features are fed to the Backpropagation Network for classification. The weights for the neurons in the BPN are extracted and updated using the Ant Colony Optimization algorithm. The performance of the selection algorithms is evaluated from the classification results by generating ROC curves using validation methods such as the Jack-Knife method, the Round-Robin method, and ten-fold validation. The rest of the paper is organized as follows: the following section presents a brief discussion of the segmentation of microcalcifications. Section 3 describes the feature extraction method. Section 4 focuses on the feature selection algorithms, namely the Genetic Algorithm and the Ant Colony Optimization algorithm. The classification is described in Section 5. The ROC analysis and the results with performance analysis are presented in Sections 6 and 7, and conclusions are given in Section 8.

2. Segmentation of Microcalcifications
Before extracting the texture features, microcalcifications should be segmented from the background of the mammographic images [25]. In this paper, microcalcifications are segmented using the Ant Colony Optimization (ACO) algorithm hybrid with a Markov Random Field (MRF), as described in [26]. The segmentation process with this method consists of three steps. In the first step, the mammogram image is enhanced using median filtering and the pectoral muscle region is removed from the breast region. In the second step, cliques having similar arrangements of pixels are assigned a unique label, and the Maximum a Posteriori (MAP) function value is estimated for each clique using the MRF [16,18,23]; here, a clique is a 3x3 window of neighboring pixels. The segmentation is performed with the optimum label, which minimizes the MAP estimate. The ACO algorithm [20] is implemented in the third step to find the optimum label. The intensity value l_opt of the pixel that generates the optimum label is traced, and the pixels having an intensity value equal to or higher than l_opt are extracted from the original image to create the segmented image. This segmented image is used in the next step for extracting the texture features.
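Once l_opt has been found by the MRF-ACO labeling, the final extraction step reduces to a simple intensity threshold. The following is a minimal sketch in Python/NumPy of that last step only, assuming the enhanced image and l_opt are already available (the MRF-ACO search itself is not shown, and the example image is synthetic):

```python
import numpy as np

def segment_by_lopt(image: np.ndarray, l_opt: float) -> np.ndarray:
    """Keep only pixels whose intensity is >= l_opt; zero out the rest.

    `image` stands for the median-filtered mammogram with the pectoral
    muscle removed; `l_opt` is the optimum intensity traced by the
    MRF-ACO labeling step (assumed to be computed elsewhere).
    """
    segmented = np.zeros_like(image)
    mask = image >= l_opt
    segmented[mask] = image[mask]
    return segmented

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)  # placeholder image
    seg = segment_by_lopt(img, l_opt=200)
    print("pixels kept:", int((seg > 0).sum()))
```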

3. Feature Extraction
The texture of an image refers to the appearance, structure, and arrangement of the parts of an object within the image. Images used for diagnostic purposes in clinical practice are digital. A two-dimensional digital image is made up of small rectangular blocks, or pixels (picture elements); each is represented by a set of coordinates in space, and each has a value representing the gray-level intensity of that picture element. A feature value is a real number that encodes some discriminatory information about a property of an object. In this paper, the Spatial Gray Level Dependence Method is used to extract the features from the segmented mammogram image.

3.1 Spatial Gray Level Dependence Method
In this method, co-occurrence matrices are generated to extract the texture features from the segmented mammogram image. Many co-occurrence matrices may be computed for a single image, one for each pair of distance and direction defined. Normally a set of 20 co-occurrence matrices is computed, for five different distances in the horizontal, vertical, and two diagonal directions; i.e., the distances are 1, 3, 5, 7, and 9, and the four angles 0°, 45°, 90°, and 135° are defined for calculating the matrix at each of the five distances. Since the co-occurrence matrix analyzes the gray-level distribution of pairs of pixels, it is also known as the second-order histogram. The estimated joint conditional probability density functions are defined in [12]. The features computed from the co-occurrence matrix are Angular Second Moment (ASM), Contrast (CON), Correlation (COR), Variance (VAR), Inverse Difference Moment (IDM), Sum Average (SA), Sum Variance (SV), Sum Entropy (SE), Entropy (ENT), Difference Variance (DV), Difference Entropy (DE), Information Measure of Correlation I (IMC1), Information Measure of Correlation II (IMC2), and Maximal Correlation Coefficient (MCC). The features based on the co-occurrence matrices should capture some characteristics of textures, such as homogeneity, coarseness, and periodicity. These 14 features can be put into four groups [10].
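As an illustration, here is a minimal sketch of how one such co-occurrence matrix and a few of the Haralick features [12] (ASM, contrast, entropy) could be computed. The particular offset, the quantization to 16 gray levels, and the symmetric normalization are assumptions made for brevity and are not prescribed by the paper:

```python
import numpy as np

def glcm(image: np.ndarray, dx: int, dy: int, levels: int = 16) -> np.ndarray:
    """Normalized gray-level co-occurrence matrix for offset (dx, dy)."""
    # Quantize to `levels` gray levels to keep the matrix small (assumption).
    q = (image.astype(np.float64) / 256.0 * levels).astype(int).clip(0, levels - 1)
    h, w = q.shape
    p = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                p[q[y, x], q[y2, x2]] += 1.0
    p += p.T                      # make the matrix symmetric
    return p / p.sum()            # joint probability estimate

def haralick_subset(p: np.ndarray) -> dict:
    """Three of the 14 Haralick features from a normalized GLCM."""
    i, j = np.indices(p.shape)
    asm = float((p ** 2).sum())                             # Angular Second Moment
    contrast = float(((i - j) ** 2 * p).sum())              # Contrast
    entropy = float(-(p[p > 0] * np.log(p[p > 0])).sum())   # Entropy
    return {"ASM": asm, "CON": contrast, "ENT": entropy}

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    seg = rng.integers(0, 256, size=(64, 64))   # stand-in for a segmented image
    # Distance 1 at 0 degrees; the paper uses distances 1, 3, 5, 7, 9 and
    # angles 0, 45, 90, 135 degrees, i.e. 20 matrices per image.
    print(haralick_subset(glcm(seg, dx=1, dy=0)))
```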

4. Feature Selection
Feature selection here refers to the problem of dimensionality reduction of data that initially contain a high number of features. One hopes to choose optimal subsets of the original features that still contain the information essential for the classification task.


At the same time, this reduces the computational burden imposed by using many features [17]. In this paper, the Genetic Algorithm (GA) and Ant Colony Optimization (ACO) algorithms are proposed for feature selection.

4.1 Feature Selection Using the Genetic Algorithm
A GA is a heuristic search or optimization technique for obtaining the best possible solution in a vast solution space [8,9]. In this paper, 20 co-occurrence matrices are created for each image, one for each pair of distance and direction defined. The Haralick features are extracted for all 114 images, and the features are grouped into four categories as discussed in the previous section. The values of a single feature over all the images are taken as the initial population for the genetic algorithm, and an optimum value is found for each individual feature. Within a group, the optimum values of the individual features are compared, and the feature that is optimum among the others in the same group is selected for classification. In this way, one optimum feature is selected for every group, so the algorithm returns four optimum features from the set of 14. Only the selected features are considered for classification. The features selected by the Genetic Algorithm are ASM, VAR, ENT, and IMC2.
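The paper does not spell out the GA's fitness function or genetic operators, so the following sketch is only one plausible reading: each feature's normalized values form a population, a fitness of the form 1/(1 + p) (borrowed from the ACO search in Section 4.2) is evolved with roulette selection, arithmetic crossover, and small mutation, and the best-scoring feature per group is kept. The group/feature data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)

def fitness(values: np.ndarray) -> np.ndarray:
    # Assumed fitness, same form as the one used for the ACO search.
    return 1.0 / (1.0 + values)

def evolve_feature(values: np.ndarray, generations: int = 50, pop_size: int = 30) -> float:
    """Evolve a population drawn from one feature's normalized values and
    return the best fitness reached. The operators are illustrative
    assumptions, not the paper's exact choices."""
    pop = rng.choice(values, size=pop_size)
    for _ in range(generations):
        fit = fitness(pop)
        probs = fit / fit.sum()                      # roulette-wheel selection
        parents = rng.choice(pop, size=pop_size, p=probs)
        children = 0.5 * (parents + np.roll(parents, 1))   # arithmetic crossover
        pop = np.clip(children + rng.normal(0.0, 0.01, pop_size), 0.0, 1.0)  # mutation
    return float(fitness(pop).max())

def select_per_group(groups: dict) -> dict:
    """Pick, in each group, the feature whose evolved fitness is best."""
    return {g: max(feats, key=lambda name: evolve_feature(feats[name]))
            for g, feats in groups.items()}

if __name__ == "__main__":
    # Synthetic stand-in: normalized feature values for 114 images.
    groups = {"group1": {"ASM": rng.random(114), "CON": rng.random(114)},
              "group2": {"COR": rng.random(114), "VAR": rng.random(114)}}
    print(select_per_group(groups))
```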

4.2 Feature Selection Using the Ant Colony Optimization Algorithm
Ant algorithms were first proposed by Dorigo and colleagues [5] as a multi-agent approach to difficult combinatorial optimization problems such as the Traveling Salesman Problem (TSP) and the Quadratic Assignment Problem (QAP). There is currently much ongoing activity in the scientific community to extend and apply ant-based algorithms to many different discrete optimization problems. Recent applications cover problems such as vehicle routing, sequential ordering, graph coloring, network routing, and so on.

Ant algorithms were inspired by the observation of real ant colonies. Ants are social insects; that is, they live in colonies and their behavior is directed more to the survival of the colony as a whole than to that of a single individual member of the colony. Social insects have captured the attention of many scientists because of the high level of structure their colonies can achieve, especially when compared to the relative simplicity of the colony's individuals. An important and interesting behavior of ant colonies is their foraging behavior, in particular how ants can find the shortest paths between food sources and their nest.

While walking from food sources to the nest and vice versa, ants deposit on the ground a substance called pheromone, forming in this way a pheromone trail. Ants can smell pheromone and, when choosing their way, tend to choose, with higher probability, paths marked by strong pheromone concentrations. The pheromone trail allows the ants to find their way back to the food source (or to the nest); other ants can also use it to find the location of the food sources discovered by their nestmates. It has been shown experimentally that this pheromone-trail-following behavior, once adopted by a colony of ants, can give rise to the emergence of shortest paths: when several paths are available from the nest to a food source, a colony of ants may be able to exploit the pheromone trails left by individual ants to discover the shortest path from the nest to the food source and back [4,6,7].

In ACO algorithms, a finite-size colony of artificial ants with the characteristics described above collectively searches for good-quality solutions to the optimization problem under consideration. Each ant builds a solution, or a component of it, starting from an initial state selected according to some problem-dependent criteria. While building its own solution, each ant collects information on the problem characteristics and on its own performance, and uses this information to modify the representation of the problem as seen by the other ants. Ants can act concurrently and independently, showing cooperative behavior. They do not use direct communication: it is the stigmergy paradigm that governs the information exchange among the ants.

In our proposed algorithm, the normalized individual feature set is considered as the solution space for the ACO search. Each feature value is labeled with a number corresponding to its fitness value, calculated using the fitness function 1/(1 + P_i), where P_i is the feature value. A solution matrix is created with the feature values, fitness values, and their corresponding labels. Initially, K ants start their search from a randomly selected feature value with an initial pheromone of τ_0. A random number q is generated and compared with q_0: if q < q_0, the corresponding feature value is assigned the maximum label from the label set; otherwise, a random label is assigned to that feature value. This step is repeated for each ant and for each feature value in the feature set.


Once all the ants have created their solutions, the pheromone values of the ants having the optimum label are locally updated as defined below:

τ_new = (1 − ρ) · τ_old + ρ · τ_0,   (1)

where τ_old and τ_new are the old and new pheromone values, and ρ is the pheromone evaporation rate parameter for the local update, ranging over [0, 1], i.e., 0 < ρ < 1. Then the fitness values of all the ants are locally improved by replacing them with the maximum fitness value, which is searched among the fitness values of the features having the optimum label. The optimum fitness value is then selected from the set of solutions created by all the ants. This value is known as the local minimum (L_min); if it is greater than the global minimum (G_min), then G_min is assigned the value of L_min. The pheromone of the ant that generates G_min is globally updated as:

τ_new = (1 − α) · τ_old + α · Δ · τ_old,   (2)

where α is the pheromone evaporation rate parameter for the global update, also called the trail's relative importance, ranging over [0, 1], i.e., 0 < α < 1, and Δ is equal to 1/G_min. For the remaining ants the pheromone is updated as τ_new = (1 − α) · τ_old, i.e., with Δ taken as 0. Thus the pheromones are updated globally.

At the final iteration, G_min carries the label of the optimum feature. To further refine the result, this entire procedure can be repeated a number of times. As in the previous method, the optimum feature is selected from each group, and only those selected features are used further in the classification. As a result, ASM, IDM, ENT, and IMC2 are the features selected by this algorithm.
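The following is a compact sketch of the label-assignment and pheromone-update loop described above, for one group of normalized feature values in [0, 1]. The labels are proxied by fitness ranks, the parameter values (number of ants, q_0, ρ, α, τ_0, iteration count) are illustrative, and the bookkeeping around L_min/G_min is a simplified reading of the text, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(7)

def aco_select(values: np.ndarray, n_ants: int = 10, iters: int = 50,
               q0: float = 0.9, rho: float = 0.1, alpha: float = 0.1,
               tau0: float = 0.01) -> int:
    """Return the index of the feature value carrying the optimum label."""
    fit = 1.0 / (1.0 + values)            # fitness of each candidate value
    labels = np.argsort(np.argsort(fit))  # higher fitness -> higher label
    tau = np.full(n_ants, tau0)
    g_best_fit, g_best_idx = -np.inf, 0

    for _ in range(iters):
        choices = np.empty(n_ants, dtype=int)
        for k in range(n_ants):
            if rng.random() < q0:
                choices[k] = int(labels.argmax())        # exploit: maximum label
            else:
                choices[k] = rng.integers(len(values))   # explore: random label
        # Local update (Eq. 1) for ants that picked the optimum label.
        on_opt = choices == labels.argmax()
        tau[on_opt] = (1 - rho) * tau[on_opt] + rho * tau0
        # Track the best fitness seen; global update (Eq. 2) for its ant.
        it_best = int(fit[choices].argmax())
        if fit[choices[it_best]] > g_best_fit:
            g_best_fit = float(fit[choices[it_best]])
            g_best_idx = int(choices[it_best])
        delta = 1.0 / g_best_fit
        tau[it_best] = (1 - alpha) * tau[it_best] + alpha * delta * tau[it_best]
        tau[np.arange(n_ants) != it_best] *= (1 - alpha)  # remaining ants: delta = 0
    return g_best_idx

if __name__ == "__main__":
    group_values = rng.random(14)          # stand-in for one group's normalized features
    print("selected index:", aco_select(group_values))
```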

5. Backpropagation Network Classifier Hybrid with the Ant Colony Optimization Algorithm
Neural networks (NN) can learn various tasks from training examples, classify phenomena, and model nonlinear relationships. However, the primary features of concern in the design of the network are problem specific. Despite the availability of some guidelines, it would be helpful to have a computational procedure for this aspect, especially for the optimum design of an NN. Gradient descent algorithms have reported difficulties in learning the topology of the networks whose weights they optimize.

In this paper, the Backpropagation Neural Network hybrid with the Ant Colony Optimization algorithm is used for learning.

Backpropagation is a learning algorithm for multi-layered feed-forward networks that uses the sigmoid function. The backpropagation neural network optimizes the net for correct responses to the training input data set. More than one hidden layer may be beneficial for some applications, but one hidden layer is sufficient if enough hidden neurons are used [1,13,15].

In the backpropagation algorithm, the error function is calculated after the presentation of each input, and the error is propagated back through the network, modifying the weights before the presentation of the next pattern. This error function is usually the Mean Square Error (MSE) of the difference between the desired and the actual responses of the network over all the output units. The new weights then remain fixed, a new image is presented to the network, and this process continues until all the images have been presented. The presentation of all the patterns is usually called one epoch or a single iteration. In practice, many epochs are needed before the error becomes acceptably small. The number of hidden neurons is equal to the number of input neurons, and there is only one output neuron. The initial weights are extracted using the ACO algorithm as follows.

In weight extraction, N random numbers are generated, each with d digits, where N is the total number of neurons in the BPN. The weights are extracted from the population of random numbers to determine the fitness values. The actual weight w_k is given by:

w_k = c · (x_{kd+2} · 10^(d−2) + x_{kd+3} · 10^(d−3) + … + x_{(k+1)d}) / 10^(d−2),   (3)

where c = 1 if 5 < x_{kd+1} < 9, else c = −1, and k indexes the population. The weights are extracted for each string in the population. The fitness value is calculated as defined below:

F = 1 / E,   (4)

where E = √((E_1 + E_2 + … + E_m) / m), m is the total number of training patterns, and E_1, E_2, …, E_m are the errors for each pattern, i.e., E_i = (T_i − O_i)², where T_i is the desired output and O_i is the actual output of the output layer.
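A small sketch of this weight extraction and fitness computation, under the reconstruction of Eqs. (3) and (4) given above; the number of digits d, the number of weights, and the target/output values are illustrative, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(3)

def extract_weight(digits: np.ndarray) -> float:
    """Decode one weight from a string of d decimal digits (Eq. 3).

    digits[0] plays the role of the sign digit x_{kd+1} and
    digits[1:] the magnitude digits x_{kd+2} ... x_{(k+1)d}.
    """
    d = len(digits)
    c = 1.0 if 5 < digits[0] < 9 else -1.0
    powers = 10.0 ** np.arange(d - 2, -1, -1)          # 10^(d-2) ... 10^0
    return c * float((digits[1:] * powers).sum()) / 10.0 ** (d - 2)

def fitness(targets: np.ndarray, outputs: np.ndarray) -> float:
    """Fitness F = 1/E with E = sqrt(mean per-pattern squared error) (Eq. 4)."""
    per_pattern = (targets - outputs) ** 2
    e = np.sqrt(per_pattern.mean())
    return 1.0 / e

if __name__ == "__main__":
    d, n_weights = 5, 4                                # illustrative sizes
    digit_strings = rng.integers(0, 10, size=(n_weights, d))
    weights = [extract_weight(s) for s in digit_strings]
    print("decoded weights:", np.round(weights, 3))
    print("fitness:", fitness(np.array([0.9, 0.1]), np.array([0.7, 0.2])))
```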

Thus the fitness value is calculated for a single population. In this way, M populations are generated and their fitness values are calculated.


Then the optimum fitness value is selected using the ACO algorithm.

The procedure for finding the optimum value is similar to the algorithm described in Section 4.2. All the minimum fitness values are replaced with the maximum fitness value. The weights are then updated with the new fitness value and the training is performed again. This procedure is repeated until the error from the backpropagation network is less than the tolerance value.

The output from each hidden neuron is calculated using the sigmoid function S_1 = 1 / (1 + e^(−λx)), where λ = 1 and x = Σ_i w_ih(i) · k_i, with w_ih the weights between the input and hidden layers and k_i the input values. The output from the output layer is calculated using the sigmoid function S_2 = 1 / (1 + e^(−λx)), where λ = 1 and x = Σ_i w_ho(i) · S_i, with w_ho the weights between the hidden and output layers and S_i the outputs of the hidden neurons. The network is trained to produce an output value of 0.9 for malignant and 0.1 for benign.
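A minimal sketch of this forward pass for a 4-4-1 network (four selected features, the same number of hidden neurons, one output neuron). The weight matrices below are random placeholders standing in for the ACO-extracted weights, and the 0.5 decision cut-off is only an illustrative way to read the 0.9/0.1 targets:

```python
import numpy as np

rng = np.random.default_rng(5)

def sigmoid(x: np.ndarray, lam: float = 1.0) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-lam * x))

def forward(features: np.ndarray, w_ih: np.ndarray, w_ho: np.ndarray) -> float:
    """One forward pass: selected features -> hidden layer -> single output."""
    s1 = sigmoid(w_ih @ features)      # hidden-layer outputs S_1
    s2 = sigmoid(w_ho @ s1)            # output-layer value S_2
    return float(s2[0])

if __name__ == "__main__":
    x = rng.random(4)                  # stand-ins for normalized ASM, IDM, ENT, IMC2
    w_ih = rng.normal(size=(4, 4))     # placeholder for ACO-extracted weights
    w_ho = rng.normal(size=(1, 4))
    y = forward(x, w_ih, w_ho)
    label = "malignant" if y >= 0.5 else "benign"   # training targets are 0.9 / 0.1
    print(f"output = {y:.3f} -> {label}")
```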


6. Receiver Operating Characteristic Analysis
The Receiver Operating Characteristic (ROC) curve is a popular tool in medical and imaging research. It conveniently displays diagnostic accuracy expressed in terms of sensitivity (or true-positive rate) against 1 − specificity (or false-positive rate) at all possible threshold values. The performance of each test is characterized in terms of its ability to identify true positives while rejecting false positives [24].
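For reference, here is a short sketch of how an ROC curve and its area (the A_z figure of merit reported in the next section) could be computed from classifier outputs by sweeping a decision threshold. This is the generic construction, not the paper's specific validation pipeline, and the scores/labels are synthetic:

```python
import numpy as np

def roc_curve(scores: np.ndarray, labels: np.ndarray):
    """TPF/FPF pairs obtained by sweeping a threshold over the scores.
    labels: 1 = malignant (positive), 0 = benign (negative)."""
    thresholds = np.sort(np.unique(scores))[::-1]
    tpf, fpf = [0.0], [0.0]
    pos, neg = labels.sum(), (1 - labels).sum()
    for t in thresholds:
        pred = scores >= t
        tpf.append(float((pred & (labels == 1)).sum()) / pos)
        fpf.append(float((pred & (labels == 0)).sum()) / neg)
    return np.array(fpf), np.array(tpf)

def area_under_curve(fpf: np.ndarray, tpf: np.ndarray) -> float:
    """Trapezoidal estimate of the area under the ROC curve (A_z)."""
    return float(np.sum((fpf[1:] - fpf[:-1]) * (tpf[1:] + tpf[:-1]) / 2.0))

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    labels = rng.integers(0, 2, size=114)                               # synthetic ground truth
    scores = np.clip(0.5 * labels + rng.normal(0.4, 0.2, 114), 0, 1)    # synthetic BPN outputs
    fpf, tpf = roc_curve(scores, labels)
    print("A_z =", round(area_under_curve(fpf, tpf), 3))
```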

7. Results and Discussion
The images used in this work were taken from the Mammography Image Analysis Society (MIAS) database (2003). The database consists of 320 images belonging to three categories: normal, benign, and malignant. There are 206 normal images, 63 benign, and 51 malignant. In this paper, only the benign and malignant images are considered for feature extraction. The database also specifies the locations of any abnormalities that may be present. The classification results of the backpropagation neural network are tested using the jack-knife method [28], the round-robin method [19], and the ten-fold validation method. The results were analyzed using ROC curves [11].


Figure 1. ROC curves for the feature selection methods, GA and ACO, based on (a) the Jack-Knife method, (b) the Round-Robin method, and (c) the ten-fold validation method.

Figure 1(a) shows the ROC curves generated for GA and ACO based on the jack-knife method; the A_z value is 0.85 for GA and 0.92 for ACO. Figure 1(b) shows the ROC curves based on the round-robin method; the A_z value is 0.86 for GA and 0.94 for ACO. Figure 1(c) shows the ROC curves based on the ten-fold validation method; the A_z value is 0.84 for GA and 0.93 for ACO.

8. Conclusion
In this paper, SGLDM is used to extract the Haralick features from the segmented mammogram image, and the features are grouped into four categories based on visual texture characteristics, statistics, information theory, and information measures of correlation.

The Genetic Algorithm and Ant Colony Optimization algorithms are proposed for feature selection. Each algorithm selects the optimum feature from each group, and the selected features are used for classification. A three-layer Backpropagation Neural Network hybrid with the Ant Colony Optimization algorithm is used for classification, with the ACO algorithm used for weight extraction during learning. ROC analysis is performed to compare the classification results of the feature selection algorithms. The results show that the ACO algorithm selects better features than the GA.


References
[1] Arbach, L., Bennett, D.L., Reinhardt, D.L., and Fallouh, D.L.: "Distinguishing between malignant and non-malignant breast masses from mammograms using AI techniques", Basel Al-Assad Journal for Engineering Sci., 03-121, 2002.
[2] Detounis, S.: "Computer-Aided Detection and Second Reading Utility and Implementation in a High-Volume Breast Clinic", Appl. Radiology, 8-15, 2004.
[3] Dhawan, A.P., Chitre, Y., and Kaiser-Bonaso, C.: "Analysis of mammographic microcalcifications using gray-level image structure features", IEEE Trans. Med. Imag., 15(3): 246-259, 1996.
[4] Dorigo, M. and Gambardella, L.M.: "Ant colonies for the traveling salesman problem", BioSystems, 43: 73-81, 1997.
[5] Dorigo, M., Maniezzo, V., and Colorni, A.: "Positive feedback as a search strategy", Technical Report (16), Politecnico di Milano, Italy, 1991.
[6] Dorigo, M., Maniezzo, V., and Colorni, A.: "The Ant System: Optimization by a Colony of Cooperating Agents", IEEE Transactions on Systems, Man, and Cybernetics-Part B, 26(1): 29-41, 1996.
[7] Dorigo, M. and Stützle, T.: "Ant Colony Optimization", PHI ed., 2005.
[8] Goldberg, D.E.: "Genetic Algorithms in Search, Optimization and Machine Learning", Addison Wesley Longman Pte. Ltd., 3rd ed., 60-68, 2000.
[9] Gudmundsson, M., El-Kwae, E.A., and Kabuka, M.R.: "Edge Detection in Medical Images Using a Genetic Algorithm", IEEE Transactions on Medical Imaging, 17(3): 469-474, 1998.
[10] Gulsrud, T.O.: "Texture Analysis of Digital Mammograms", Ph.D. Thesis, 30-32, 2000.
[11] Jean, H., David, M.C., Charles, F.B., Zygmunt, P., and Edward, J.D.: "Preclinical ROC studies of digital stereomammography", IEEE Trans. Med. Imag., 14(2): 318-327, 1995.
[12] Haralick, R., Shanmugam, K., and Dinstein, I.: "Textural features for image classification", IEEE Trans. Syst., Man, Cybern., 3: 610-621, 1973.
[13] Hertz, J., Krogh, A., and Palmer, R.: "Introduction to the Theory of Neural Computation", Reading, MA: Addison-Wesley, 1991.
[14] Highnam, R. and Brady, M.: "Mammographic Image Analysis", 1st ed., Dordrecht: Kluwer Academic, 1999.
[15] Karssemeijer, N.: "Automated classification of parenchymal patterns in mammograms", Phys. Med. Biol., 43(2): 365-378, 1998.
[16] Kervrann, C. and Heitz, F.: "A Markov Random Field Model-Based Approach to Unsupervised Texture Segmentation Using Local and Global Statistics", IEEE Trans. on Image Processing, 4(6): 856-862, 1995.
[17] Kopans, D.B.: "Positive predictive value of mammography", American Journal of Radiology, 158: 521-526, 1992.
[18] Li, H.D., Kallergi, M., Clarke, L.P., Jain, V.K., and Clark, R.A.: "Markov Random Field for Tumor Detection in Digital Mammography", IEEE Transactions on Medical Imaging, 14(3): 565-576, 1995.
[19] Nadler, M. and Smith, E.P.: "Pattern Recognition Engineering", New York: Wiley, 1993.
[20] Ouadfel, S. and Batouche, M.: "MRF-based image segmentation using Ant Colony System", Electronic Letters on Computer Vision and Image Analysis, 2(2): 12-24, 2003.
[21] Rafiee-Rad, F., Soltanian-Zadeh, H., Rahmati, M., and Pour-Abdollah, S.: "Microcalcification classification in mammograms using wavelet features", Proc. SPIE 3813: 832-841, 1999.
[22] Shen, L., Rangayyan, R.M., and Desautels, J.E.L.: "Application of shape analysis to mammographic calcifications", IEEE Trans. Med. Imag., 13(2): 263-274, 1994.
[23] Speis, A. and Healey, G.: "An Analytical and Experimental Study of the Performance of Markov Random Fields Applied to Textured Images Using Small Samples", IEEE Transactions on Image Processing, 5(3): 447-458, 1996.
[24] Streiner, D.L. and Norman, G.R.: "Health Measurement Scales", 2nd edn., Oxford University Press, Oxford, 1995.
[25] Thangavel, K. and Karnan, M.: "Computer Aided Diagnosis in Digital Mammograms: Detection of Microcalcifications by Meta Heuristic Algorithms", International Journal on Graphics, Vision and Image Processing, vol. 7, 2005.
[26] Thangavel, K., Karnan, M., Sivakumar, R., and Kaja Mohideen, A.: "Segmentation and Classification of Microcalcification in Mammograms Using the Ant Colony System", International Journal on Artificial Intelligence in Machine Learning.
[27] Thangavel, K., Karnan, M., Sivakumar, R., and Kaja Mohideen, A.: "Automatic Detection of Microcalcification in Mammograms - A Review", International Journal on Graphics, Vision and Image Processing, 5(5): 31-61, 2005.
[28] Wu, Y., Doi, K., Giger, M.L., and Nishikawa, R.M.: "Computerized detection of clustered microcalcifications in digital mammograms: Application of artificial neural networks", Med. Phys., 19(3): 555-560, 1992.
