Concept learning is a classification task with practical applications in several areas. In this paper, a new evolutionary concept learning algorithm is proposed and a corresponding learning system, called ECL (Evolutionary Concept Learner), is implemented. The system is compared with three traditional learning systems: MLP (Multilayer Perceptron), ID3 (Iterative Dichotomiser 3), and NB (Naïve Bayes). The comparison covers target concepts of varying complexity (e.g., with interacting attributes) and training sets of varying quality (e.g., with imbalanced classes and noisy class labels). The results show that, although no single system is best in all situations, the proposed ECL system has very good overall performance.
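As an illustration of the kind of experiment the abstract describes (not the paper's ECL system itself), the sketch below compares the three baseline learners named above on a synthetic binary concept with imbalanced classes and noisy training labels. It uses scikit-learn, standing in an entropy-criterion decision tree for ID3; the dataset shape, 90/10 class skew, and 5% label-noise rate are arbitrary choices for the demonstration.

```python
# Hedged sketch: MLP vs. ID3-style tree vs. Naive Bayes on an imbalanced,
# label-noisy binary classification task (synthetic data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import balanced_accuracy_score

rng = np.random.RandomState(0)

# Imbalanced binary concept: roughly 90% negative, 10% positive examples.
X, y = make_classification(n_samples=1000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Simulate noisy class labels by flipping 5% of the training labels.
flip = rng.rand(len(y_tr)) < 0.05
y_tr_noisy = np.where(flip, 1 - y_tr, y_tr)

learners = {
    "MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                         random_state=0),
    "ID3-style tree": DecisionTreeClassifier(criterion="entropy",
                                             random_state=0),
    "Naive Bayes": GaussianNB(),
}

scores = {}
for name, clf in learners.items():
    clf.fit(X_tr, y_tr_noisy)
    # Balanced accuracy weights both classes equally, so it is not
    # dominated by the 90% majority class.
    scores[name] = balanced_accuracy_score(y_te, clf.predict(X_te))

for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.3f}")
```

Balanced accuracy is used here because plain accuracy is misleading under class imbalance: a classifier that always predicts the majority class would already score about 0.9.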
Cite this paper
Morgon, R. and Pereira, S. (2014) Evolutionary Learning of Concepts. Journal of Computer and Communications, 2, 76-86. doi: 10.4236/jcc.2014.28008.