Abstract
Feature selection methods applied to an intrusion dataset are used to classify the data as normal or intrusive. Based on performance evaluation using various feature selection algorithms and the behavior of the attributes, we can distinguish the features that play an important role in detecting intrusions. The dataset has 41 features, of which some contribute significantly to detecting intrusions while others do not contribute to the detection process. We have applied different feature selection techniques to extract the predominant features that are actually effective in detecting intrusions.
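To make the idea concrete, the sketch below ranks features by information gain, one common filter-style criterion used to separate predominant from non-contributing attributes in 41-feature KDD-style intrusion datasets. This is an illustrative sketch only: the tiny two-feature dataset is synthetic, and the paper does not specify that this exact criterion or code was used.

```python
# Illustrative sketch (not the authors' code): ranking discrete features
# by information gain on a synthetic, KDD-style toy dataset.
import math
from collections import Counter


def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())


def information_gain(feature_values, labels):
    """Reduction in class entropy after splitting on one discrete feature."""
    n = len(labels)
    remainder = 0.0
    for v in set(feature_values):
        subset = [lab for f, lab in zip(feature_values, labels) if f == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder


# Synthetic records (protocol, flag, class), standing in for the 41 attributes.
records = [
    ("tcp", "SF", "normal"), ("tcp", "S0", "attack"),
    ("udp", "SF", "normal"), ("tcp", "S0", "attack"),
    ("udp", "SF", "normal"), ("icmp", "S0", "attack"),
]
labels = [r[2] for r in records]
gains = {
    "protocol": information_gain([r[0] for r in records], labels),
    "flag": information_gain([r[1] for r in records], labels),
}
# Features with higher gain are the "predominant" ones a filter would keep.
ranked = sorted(gains, key=gains.get, reverse=True)
print(ranked)  # 'flag' perfectly separates the toy classes, so it ranks first
```

On the real dataset the same ranking would be computed over all 41 attributes, and low-gain features would be discarded before training the classifiers.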
© 2021 Springer Nature Singapore Pte Ltd.
Cite this paper
Nanda, M.K., Patra, M.R. (2021). Intrusion Detection and Classification Using Decision Tree-Based Feature Selection Classifiers. In: Mishra, D., Buyya, R., Mohapatra, P., Patnaik, S. (eds) Intelligent and Cloud Computing. Smart Innovation, Systems and Technologies, vol 153. Springer, Singapore. https://doi.org/10.1007/978-981-15-6202-0_17
Print ISBN: 978-981-15-6201-3
Online ISBN: 978-981-15-6202-0
eBook Packages: Intelligent Technologies and Robotics