
The Architecture and Performance of a Stochastic Competitive Evolutionary Neural Tree Network


Abstract

A new dynamic tree-structured network, the Stochastic Competitive Evolutionary Neural Tree (SCENT), is introduced. The network provides a hierarchical classification of unlabelled data sets. The main advantage that SCENT offers over other hierarchical competitive networks is its ability to determine the number and structure of its competitive nodes for itself, without externally set parameters. The network produces stable classificatory structures by halting its growth using locally calculated, stochastically controlled heuristics. Its performance is analysed by comparing its results with those of a good non-hierarchical clusterer, three other hierarchical clusterers, and its non-stochastic predecessor. SCENT's classificatory capability is demonstrated by its ability to produce representative hierarchical structures for a broad range of data sets.
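To make the mechanism concrete, the following is a minimal Python sketch of a competitive neural tree whose growth is gated by a locally computed, stochastically controlled test. It illustrates the general idea only, not the paper's algorithm: the growth probability, the temperature and min_wins parameters, and the fixed two-way split are all assumptions made for this example.

    import numpy as np

    class Node:
        """A competitive node: a prototype vector plus (possibly) child nodes."""
        def __init__(self, prototype):
            self.prototype = np.asarray(prototype, dtype=float)
            self.children = []
            self.error = 0.0   # locally accumulated quantisation error
            self.wins = 0      # number of times this node has won

    def train(root, data, epochs=10, lr=0.1, temperature=0.5,
              min_wins=20, rng=None):
        """Grow and train a competitive neural tree on unlabelled data.

        A winning leaf spawns children with a probability that rises with
        its average local quantisation error (a hypothetical rule; the
        paper's actual heuristic differs in detail).
        """
        rng = np.random.default_rng(0) if rng is None else rng
        for _ in range(epochs):
            for x in data:
                node = root
                # Descend: at each level the closest child wins and is
                # moved towards the input (standard competitive update).
                while node.children:
                    node = min(node.children,
                               key=lambda c: float(np.linalg.norm(x - c.prototype)))
                    node.prototype += lr * (x - node.prototype)
                node.wins += 1
                node.error += float(np.linalg.norm(x - node.prototype))
                # Locally calculated, stochastically controlled growth test.
                p_grow = 1.0 - np.exp(-(node.error / node.wins) / temperature)
                if node.wins >= min_wins and rng.random() < p_grow:
                    for _ in range(2):  # split into two jittered children
                        jitter = rng.normal(0.0, 0.01, size=x.shape)
                        node.children.append(Node(node.prototype + jitter))
                    node.wins, node.error = 0, 0.0  # reset local statistics
        return root

    # Example: grow a tree over a small two-cluster 2-D data set.
    data = np.vstack([np.random.randn(100, 2) + [3, 3],
                      np.random.randn(100, 2) - [3, 3]])
    tree = train(Node(data.mean(axis=0)), data)

Note that this sketch always splits a node into exactly two children; the network described in the paper determines the number and arrangement of child nodes for itself, which is precisely the self-structuring behaviour the abstract highlights.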




Cite this article

Davey, N., Adams, R. & George, S. The Architecture and Performance of a Stochastic Competitive Evolutionary Neural Tree Network. Applied Intelligence 12, 75–93 (2000). https://doi.org/10.1023/A:1008364004705
