Comparison of Classification Algorithms for Machine Learning Based Network Intrusion Detection Systems


Dr. Srinivasa K. G., P. N. Pavan Kumar
M. S. Ramaiah Institute of Technology, Bangalore, India

Presented at: International Conference on Electrical, Communication and Information Technologies, ICECIT-2012

Network Intrusion Detection Systems (NIDS)

- Data stored on networked systems is at increasing risk because of the growing number of network attacks.
- A NIDS plays a prominent role in any modern security system.
- Traditional IDS are signature-based systems: they cannot identify novel attacks.
- Newer attacks, whose signatures have not yet been identified, are a bigger security threat than existing ones.

Machine Learning in NIDS

- Based on a model generated from a standard dataset, attacks can be distinguished from regular network patterns.
- Machine learning algorithms fall into two groups:
  1. Clustering algorithms
  2. Classification algorithms
- This paper compares the performance of classification algorithms for machine learning based NIDS.

Classification Algorithms

1. Bayesian Learning: computes a posterior probability distribution by combining prior knowledge and observed data, i.e. P(h|D) is proportional to P(D|h) P(h). [4]
   - Bayesian Network
   - Naïve Bayes Classifier

2. Meta Learning: uses experience to change certain aspects of a learning algorithm, or the learning method itself, such that the modified learner is better than the original learner at learning from additional experience. [5]
   - AdaBoostM1
   - Rotation Forest

3. Rule Based Learning: these algorithms learn using a standard set of nested conditions, arranged hierarchically.
   - Decision Table
   - NNge Classifier

4. Decision Tree Learning: a decision tree is a classifier depicted in a flowchart-like tree structure; it is comprehensible in nature and resembles human reasoning. [7]
   - J48
   - NBTree
   - Random Forest
   - Random Tree
   - REP Tree
   - Simple CART

5. Classifier Functions:
   - Support Vector Machine
   - Multilayer Perceptron
   - Radial Basis Function Network

6. Lazy Learning: does not construct a model during training, only when classifying new instances.
   - KStar

Comparison of Classification Algorithms

The algorithms are compared on three criteria, measured with the pipeline of Figure 1 (sketched in code below):
1. Accuracy of the models generated
2. Time taken for model generation
3. Memory performance during model generation and evaluation

Figure 1: Block Diagram of Experimental Setup (Network Data -> Pre-Processing -> Conversion to ARFF -> NSL-KDD Training Data -> Classifier Algorithm -> NIDS Model -> Testing -> Output, with the NSL-KDD Test Data feeding the Testing stage)
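A minimal sketch of this pipeline using the Weka Java API (the tool is introduced on a later slide). The file names are placeholders for ARFF-converted NSL-KDD data, J48 stands in for any of the classifiers listed above, and the memory reading is only the JVM's current heap usage, not the peak footprint measured in the experiments:

import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class NidsExperiment {
    public static void main(String[] args) throws Exception {
        // Load the preprocessed NSL-KDD data (ARFF format); file names are placeholders.
        Instances train = DataSource.read("KDDTrain.arff");
        Instances test = DataSource.read("KDDTest.arff");
        // The class label (normal vs. attack) is the last attribute.
        train.setClassIndex(train.numAttributes() - 1);
        test.setClassIndex(test.numAttributes() - 1);

        // Criterion 2: time taken for model generation.
        J48 classifier = new J48();
        long start = System.nanoTime();
        classifier.buildClassifier(train);
        double buildSeconds = (System.nanoTime() - start) / 1e9;

        // Criterion 1: accuracy of the generated model on the test set.
        Evaluation eval = new Evaluation(train);
        eval.evaluateModel(classifier, test);

        // Criterion 3: rough memory footprint after model generation (current heap use only).
        Runtime rt = Runtime.getRuntime();
        double usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024.0 * 1024.0);

        System.out.printf("Accuracy: %.4f%%%n", eval.pctCorrect());
        System.out.printf("Build time: %.2f s%n", buildSeconds);
        System.out.printf("Approx. memory used: %.0f MB%n", usedMb);
    }
}

Swapping J48 for another of the classifier classes reproduces one row of each result table that follows.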
Dataset Used: NSL-KDD

The NSL-KDD data set [14] has the following advantages over the original KDD data set:
- It does not include redundant records in the train set.
- There are no duplicate records in the proposed test sets.
- The number of selected records from each difficulty-level group is inversely proportional to the percentage of records in the original KDD data set.
- The number of records in the train and test sets is reasonable.
Consequently, evaluation results of different research works will be consistent and comparable.

Software Used: Weka

- Waikato Environment for Knowledge Analysis.
- Supports several standard data mining tasks.
- Written in Java.
- Available for free under the GNU General Public License.
- Developed at the University of Waikato, New Zealand.
- Uses the Attribute-Relation File Format (ARFF) for input data.
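For illustration, a hypothetical ARFF fragment in the spirit of the NSL-KDD traffic data; the real header declares 41 attributes plus the class label, so the attributes and data rows below are abbreviated, made-up examples:

@relation network-traffic

@attribute duration numeric
@attribute protocol_type {tcp, udp, icmp}
@attribute src_bytes numeric
@attribute class {normal, anomaly}

@data
0, tcp, 181, normal
0, udp, 105, anomaly

Such a file can be fed directly to any Weka classifier, either through the GUI or from the command line, e.g. java weka.classifiers.trees.J48 -t KDDTrain.arff -T KDDTest.arff.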
Results: Accuracy of models generated

Class | Algorithm      | Accuracy (%)
Bayes | BayesNetwork   | 74.4322
Bayes | NaiveBayes     | 76.1178
Meta  | AdaboostM1     | 78.4422
Meta  | RotationForest | 77.4885
Rule  | DecisionTable  | 72.5958
Rule  | NNge           | 78.9168
Tree  | J48            | 81.5339
Tree  | NBTree         | 78.6107
Tree  | RandomForest   | 77.7679
Tree  | RandomTree     | 81.3565
Tree  | REPTree        | 81.5073
Tree  | SimpleCART     | 80.3229

Results: Time taken for model generation

Class | Algorithm      | Time (seconds)
Bayes | BayesNetwork   | 7.31
Bayes | NaiveBayes     | 2.03
Meta  | AdaboostM1     | 32.77
Meta  | RotationForest | 1719.03
Rule  | DecisionTable  | 111.36
Rule  | NNge           | 62.14
Tree  | J48            | 45.4
Tree  | NBTree         | 423.25
Tree  | RandomForest   | 18.87
Tree  | RandomTree     | 2.13
Tree  | REPTree        | 7.46
Tree  | SimpleCART     | 266.09

Results: Memory performance

Class | Algorithm      | Memory Used (GB)
Bayes | BayesNetwork   | 3.7
Bayes | NaiveBayes     | 3.7
Meta  | AdaboostM1     | 2.8
Meta  | RotationForest | 4.6
Rule  | DecisionTable  | 2.8
Rule  | NNge           | 2.8
Tree  | J48            | 2.8
Tree  | NBTree         | 3.7
Tree  | RandomForest   | 2.8
Tree  | RandomTree     | 2.8
Tree  | REPTree        | 4.6
Tree  | SimpleCART     | 3.7

Conclusion

- Accuracy of the generated models alone is not a good criterion for evaluating classifier algorithms; time taken for model generation and memory performance during model generation and testing are equally important.
- Decision tree based classifier algorithms are best suited for NIDS, followed by Bayesian learning algorithms.
- Overall, the J48 classification algorithm, followed by Random Tree, gave the best results considering all criteria.

References

1. Jiong Zhang, Mohammad Zulkernine: Network Intrusion Detection using Random Forests. PST (2005).
2. Nils J. Nilsson: Introduction to Machine Learning. California, United States of America (1999).
3. Mohd Fauzi bin Othman, Thomas Moh Shan Yau: Comparison of Different Classification Techniques Using Weka for Breast Cancer. International Conference on Biomedical Engineering (BIOMED 2006), 11-14 December 2006, Kuala Lumpur (2006).
4. David Poole, Alan Mackworth: Artificial Intelligence: Foundations of Computational Agents. Cambridge University Press (2010).
5. T. Schaul, J. Schmidhuber: Metalearning. Scholarpedia, 5(6):4650 (2010).
6. George H. John, Pat Langley: Estimating Continuous Distributions in Bayesian Classifiers. Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, pp. 338-345. Morgan Kaufmann, San Mateo (1995).
7. R. C. Barros, M. P. Basgalupp, A. C. P. L. F. de Carvalho, A. A. Freitas: A Survey of Evolutionary Algorithms for Decision-Tree Induction. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 3, pp. 291-312 (May 2012).
8. Yoav Freund, Robert E. Schapire: Experiments with a New Boosting Algorithm. Proc. International Conference on Machine Learning, pp. 148-156. Morgan Kaufmann, San Francisco (1996).
9. Ian H. Witten, Eibe Frank: Data Mining: Practical Machine Learning Tools and Techniques (2005).
10. Juan J. Rodriguez, Ludmila I. Kuncheva, Carlos J. Alonso: Rotation Forest: A New Classifier Ensemble Method. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(10):1619-1630 (2006).
11. Leonard Bolc, Ryszard Tadeusiewicz, Leszek J. Chmielewski, Konrad W. Wojciechowski: Computer Vision and Graphics - International Conference, ICCVG 2010, Warsaw, Poland, September 20-22, 2010, Proceedings, Part II. Springer (2010).
12. Weka software, Machine Learning, [http://www.cs.waikato.ac.nz/ml/Weka/], The University of Waikato, Hamilton, New Zealand.
13. Overview for extended Weka including Ensembles of Hierarchically Nested Dichotomies, [http://www.dbs.informatik.uni-muenchen.de/~zimek/diplomathesis/implementations/EHNDs/doc/].
14. M. Tavallaee, E. Bagheri, Wei Lu, A. A. Ghorbani: A Detailed Analysis of the KDD CUP 99 Data Set. IEEE Symposium on Computational Intelligence for Security and Defense Applications (CISDA 2009), pp. 1-6, 8-10 July 2009.
15. S. Hettich, S. D. Bay: The UCI KDD Archive [http://kdd.ics.uci.edu]. Irvine, CA: University of California, Department of Information and Computer Science (1999).
16. The NSL-KDD Dataset, [http://nsl.cs.unb.ca/NSL-KDD/], (2009).
17. Weka (machine learning) article on Wikipedia, [http://en.wikipedia.org/wiki/Weka_(machine_learning)].

THANK YOU

Dr. Srinivasa K G (kgsrinivas@msrit.edu)
Pavan Kumar P N (pavan24@ieee.org)