Vestnik KRAUNC. Fiz.-Mat. Nauki. 2022. vol. 39. no. 2. pp. 136–149. ISSN 2079-6641


MSC 68T27

Research Article

Clustering algorithm based on feature space partitioning

M. A. Kazakov

Institute of Applied Mathematics and Automation KBSC RAS, 360000, Nalchik, Shortanova st., 89a, Russia

E-mail: kasakow.muchamed@gmail.com

A new approach to robust clustering is proposed, based on recursive partitioning of the feature space and density analysis. The paper presents an algorithm for robust clustering of linearly inseparable points, its software implementation, and test results on classical data distributions.
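The abstract names the ingredients (recursive partitioning of the feature space plus density analysis) without detailing them. Purely as an illustrative sketch of that family of methods — not the author's algorithm — the generic grid-density scheme below splits the bounding box into a uniform grid (equivalent to recursively halving each axis to a fixed depth), keeps cells holding at least `min_pts` points, and joins edge-adjacent dense cells into clusters; sparse-cell points are labeled noise. All names (`grid_density_clusters`, `depth`, `min_pts`) are hypothetical.

```python
import numpy as np
from collections import deque

def grid_density_clusters(points, depth=4, min_pts=3):
    """Label points by grid-based density clustering.

    The bounding box is split into a (2**depth)-per-axis grid; cells with
    at least min_pts points are 'dense', and edge-adjacent dense cells are
    flood-filled into one cluster. Returns integer labels; -1 marks noise.
    """
    pts = np.asarray(points, dtype=float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    n_cells = 2 ** depth
    span = np.where(hi > lo, hi - lo, 1.0)        # guard degenerate axes
    # Map each point to its grid cell index along every axis.
    idx = np.minimum(((pts - lo) / span * n_cells).astype(int), n_cells - 1)
    cells = {}
    for i, c in enumerate(map(tuple, idx)):
        cells.setdefault(c, []).append(i)
    dense = {c for c, members in cells.items() if len(members) >= min_pts}

    labels = np.full(len(pts), -1)
    cluster, seen = 0, set()
    for start in dense:
        if start in seen:
            continue
        # Flood fill over edge-adjacent dense cells: one connected
        # component of dense cells becomes one cluster.
        queue, _ = deque([start]), seen.add(start)
        while queue:
            c = queue.popleft()
            for i in cells[c]:
                labels[i] = cluster
            for d in range(len(c)):
                for step in (-1, 1):
                    nb = c[:d] + (c[d] + step,) + c[d + 1:]
                    if nb in dense and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        cluster += 1
    return labels
```

On two well-separated Gaussian blobs this assigns each blob its own label, which is the behavior density-based schemes are chosen for when clusters are not linearly separable from noise by centroids alone.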

Key words: clustering, robust clustering, machine learning.

DOI: 10.26117/2079-6641-2022-39-2-136-149

Original article submitted: 05.07.2022

Revision submitted: 22.08.2022

For citation. Kazakov M. A. Clustering algorithm based on feature space partitioning. Vestnik KRAUNC. Fiz.-mat. nauki. 2022, vol. 39, no. 2, pp. 136–149. DOI: 10.26117/2079-6641-2022-39-2-136-149

Competing interests. The author declares that there are no conflicts of interest with respect to authorship and publication.

Contribution and responsibility. The author contributed to the writing of the article and is solely responsible for submitting the final version of the article to the press. The final version of the manuscript was approved by the author.

The content is published under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/deed.ru)

© Kazakov M. A., 2022

References

  1. Geron A. Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O’Reilly Media, Inc., 2019, 856 pp.
  2. Raschka S. Python machine learning. Packt Publishing Ltd, 1st edition, 2015, 456 pp.
  3. Muller A. C., Guido S. Introduction to machine learning with Python: a guide for data scientists. O’Reilly Media, 1st edition, 2016, 398 pp.
  4. MacQueen J. B. Some Methods for classification and Analysis of Multivariate Observations, Proceedings of 5th Berkeley Symposium on Mathematical Statistics and Probability, 1967, vol. 1, pp. 281–297.
  5. Lloyd S. P. Least squares quantization in PCM, IEEE Transactions on Information Theory, 1982, vol. 28, no. 2, pp. 129–137.
  6. Sibson R. SLINK: an optimally efficient algorithm for the single-link cluster method, The Computer Journal. British Computer Society, 1973, vol. 16, no. 1, pp. 30–34.
  7. Defays D. An efficient algorithm for a complete link method, The Computer Journal. British Computer Society, 1977, vol. 20, no. 4, pp. 364–366.
  8. Ester M., Kriegel H. P., Sander J., Xu X. A density-based algorithm for discovering clusters in large spatial databases with noise, KDD, 1996, vol. 96, no. 34, pp. 226–231.
  9. Sander J., et al. Density-based clustering in spatial databases: The algorithm gdbscan and its applications, Data mining and knowledge discovery, 1998, vol. 2, no. 2, pp. 169–194.
  10. Shibzukhov Z. M. On the Principle of Empirical Risk Minimization Based on Averaging Aggregation Functions, Doklady Mathematics, 2017, vol. 96, no. 2, pp. 494–497.
  11. Shibzukhov Z. M. On a Robust Approach to Search for Cluster Centers, Automation and Remote Control, 2021, vol. 82, no. 10, pp. 1742–1751. DOI: 10.1134/S0005117921100118.
  12. Shibzukhov Z. M. Machine Learning Based on the Principle of Minimizing Robust Mean Estimates, Brain-Inspired Cognitive Architectures for Artificial Intelligence: BICA*AI 2020, 2020, vol. 1310, pp. 472–477. DOI: 10.1007/978-3-030-65596-9_56.
  13. Kharinov M. V. Superpixel Clustering, International Russian Automation Conference (RusAutoCon). IEEE, 2021, pp. 303–308. DOI: 10.1109/RusAutoCon52004.2021.9537461.
  14. Huang D., Wang C. D., Lai J. H. Locally weighted ensemble clustering, IEEE Transactions on Cybernetics, 2017, vol. 48, no. 5, pp. 1460–1473. DOI: 10.1109/TCYB.2017.2702343.
  15. Debnath T., Song M. Fast Optimal Circular Clustering and Applications on Round Genomes, IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2021, vol. 18, no. 6, pp. 2061–2071. DOI: 10.1109/TCBB.2021.3077573.
  16. Nock R., Nielsen F. On Weighting Clustering, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, vol. 28, no. 8, pp. 1223–1235. DOI: 10.1109/TPAMI.2006.168.
  17. Kaur P. J., et al. Cluster quality based performance evaluation of hierarchical clustering method, IEEE, 2015, pp. 649–653. DOI: 10.1109/NGCT.2015.7375201.
  18. Flach P. Machine learning: the art and science of algorithms that make sense of data. Cambridge University Press, 1st edition, 2012, 416 pp.
  19. Shu M. L., et al. Planning the obstacle-avoidance trajectory of mobile anchor in 3D sensor networks, Science China Inform. Sci., 2015, vol. 58, no. 10, pp. 1–10.
  20. Ankerst M., Breunig M., Kriegel H. P., Sander J. OPTICS: Ordering Points To Identify the Clustering Structure, ACM SIGMOD International Conference on Management of Data. ACM Press, 1999, vol. 28, no. 2, pp. 49–60. DOI: 10.1145/304181.304187.
  21. Achtert E., Böhm C., Kröger P. DeLi-Clu: Boosting Robustness, Completeness, Usability, and Efficiency of Hierarchical Clustering by a Closest Pair Ranking, Advances in Knowledge Discovery and Data Mining. Lecture Notes in Computer Science, 2006, vol. 3918, pp. 119–128. DOI: 10.1007/11731139_16.

Kazakov Mukhamed Anatolevich – Junior Researcher of the Department of Neural Networks and Machine Learning, Institute of Applied Mathematics and Automation, Kabardino-Balkar Republic, Nalchik, Russia, ORCID 0000-0002-1576-1860