Applying Decision Tree Algorithm Classification and Regression Tree (CART) Algorithm to Gini Techniques Binary Splits


Dr. Nirmla Sharma
Sameera Iqbal Muhmmad Iqbal

Abstract

Decision tree analysis is a predictive modelling tool used across many fields. A decision tree is built through an algorithmic approach that splits the dataset in different ways based on various conditions. Decision trees are among the most powerful algorithms that fall under the category of supervised learning. Although decision trees appear simple and intuitive, there is nothing trivial about how the algorithm decides on splits or how tree pruning takes place. The first thing to understand about decision trees is that they split the predictor space with respect to the target variable into subsets that are relatively more homogeneous from the viewpoint of that target. The Gini index is the cost function used to evaluate binary splits in the dataset and works with a categorical target variable such as “Success” or “Failure”. Split creation essentially amounts to partitioning the dataset values. Decision trees follow a top-down, greedy approach known as recursive binary splitting. The study uses 15 records of student data on passing or failing an online Machine Learning exam. Decision trees belong to the class of supervised machine learning and are widely used because they are easy to implement, straightforward to interpret, able to handle quantitative, qualitative, continuous, and binary attributes, and provide reliable results. The CART algorithm also offers a regression technique for predicting the values of continuous variables, and CART regression trees are a very accessible way of interpreting results.
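To make the Gini-based split selection described above concrete, the short Python sketch below scores candidate binary splits on a toy pass/fail dataset and greedily picks the best one, as recursive binary splitting would do at each node. The dataset, feature values, and helper functions (gini, split_on, best_split) are illustrative assumptions, not the authors' original data or code.

# A minimal sketch (not the authors' implementation) of CART-style split
# scoring with the Gini index on a binary "Pass"/"Fail" target.

def gini(groups, classes):
    """Weighted Gini index of a candidate binary split.

    groups  : two lists of rows (left/right partition)
    classes : distinct target labels, e.g. ["Pass", "Fail"]
    """
    n_total = sum(len(g) for g in groups)
    score = 0.0
    for group in groups:
        if not group:
            continue
        size = len(group)
        # impurity of one group: 1 - sum(p_k^2), weighted by its relative size
        impurity = 1.0
        for c in classes:
            p = [row[-1] for row in group].count(c) / size
            impurity -= p * p
        score += impurity * (size / n_total)
    return score

def split_on(rows, feature_index, threshold):
    """Partition rows into (left, right) by a numeric threshold."""
    left = [r for r in rows if r[feature_index] < threshold]
    right = [r for r in rows if r[feature_index] >= threshold]
    return left, right

def best_split(rows, classes):
    """Greedy search over all feature/threshold pairs; recursive binary
    splitting would call this again on each resulting child node."""
    best = None
    for feature_index in range(len(rows[0]) - 1):
        for row in rows:
            groups = split_on(rows, feature_index, row[feature_index])
            g = gini(groups, classes)
            if best is None or g < best[0]:
                best = (g, feature_index, row[feature_index], groups)
    return best

# Hypothetical student records: [hours_studied, quizzes_taken, result]
data = [
    [2.0, 1, "Fail"], [3.5, 2, "Fail"], [4.0, 1, "Fail"],
    [5.5, 3, "Pass"], [6.0, 4, "Pass"], [7.5, 3, "Pass"],
]

g, feat, thr, _ = best_split(data, ["Pass", "Fail"])
print(f"best split: feature {feat} < {thr} (Gini = {g:.3f})")

Growing a full tree would simply apply best_split recursively to the left and right partitions until a stopping criterion (such as a minimum node size or maximum depth) is met; for CART regression, the Gini index is replaced by a variance-based criterion such as the sum of squared errors within each partition.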


Article Details

How to Cite
[1]
Dr. Nirmla Sharma and Sameera Iqbal Muhmmad Iqbal, “Applying Decision Tree Algorithm Classification and Regression Tree (CART) Algorithm to Gini Techniques Binary Splits”, IJEAT, vol. 12, no. 5, pp. 77–81, Jun. 2023, doi: 10.35940/ijeat.E4195.0612523.
Section
Articles

