Improving the Performance of the Convolutional Neural Network Using an Incremental Weighted Loss Function to Deal with Class Imbalance
Electronic and Cyber Defense
Article 2, Volume 11, Issue 4 (Serial Number 44), Esfand 1402 (February-March 2024), Pages 17-34. Full text (1.94 MB)
Article Type: Research Article
Authors
Nasibeh Mahmoodi 1; Hossein Shirazi* 2; Mohammad Fakhredanesh 3; Kourosh Dadashtabar Ahmadi 3
1 PhD Student, Faculty of Electrical and Computer Engineering, Malek Ashtar University of Technology
2 Associate Professor, Faculty of Electrical and Computer Engineering, Malek Ashtar University of Technology
3 Assistant Professor, Faculty of Electrical and Computer Engineering, Malek Ashtar University of Technology
Received: 08 Shahrivar 1402 (30 August 2023); Revised: 21 Aban 1402 (12 November 2023); Accepted: 22 Azar 1402 (13 December 2023)
Abstract
Since most real-world problems, such as fraud detection, fault detection, anomaly detection, medical diagnosis, and malware detection, are imbalanced, classifying data in imbalanced problems has attracted many researchers as one of the main challenges in data mining. In imbalanced learning, the number of samples in one class is usually far larger than in the other, or the cost of misclassification differs between the two classes. Despite their remarkable success in data classification, convolutional neural networks struggle with imbalanced problems because they implicitly assume a balanced class distribution and equal misclassification costs; acceptable results therefore cannot be achieved in imbalanced classification, since the network becomes biased toward the training samples of the larger class, which increases the number of errors in detecting positive samples. One low-cost way to overcome data imbalance in convolutional neural networks is to use a loss function that favors the minority class. This paper introduces a new loss function that gradually increases the importance of the minority class as training progresses, until it reaches a specified value at the end of training, while reducing the importance of the majority-class data. This lets us exploit the training signal of all the data while preventing the majority class from dominating. Experimental results on three datasets, a synthetic dataset, human activity recognition, and CIFAR-10, show the convergence and effectiveness of the proposed method. The proposed method is compared with decision-tree-based AdaBoost, convolutional networks trained with cross-entropy and weighted cross-entropy, the SMOTE method, and an ensemble CNN method. Achieving accuracies of 94.6, 92.92, and 69.23 on the three datasets, respectively (CIFAR-10 with an imbalance rate of 5 percent), it outperformed the other methods, and its accuracy on the synthetic dataset is 17.72 points higher than that of traditional decision-tree-based AdaBoost.
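As a rough formalization of this idea (the notation and the linear schedule below are our own illustration, not taken from the paper), the training objective at epoch $t$ of $T$ can be viewed as an epoch-dependent weighted cross-entropy:

$$
\mathcal{L}_t = -\frac{1}{N}\sum_{i=1}^{N} w_{y_i}(t)\,\log p_{y_i}(x_i),
\qquad
w_c(t) =
\begin{cases}
1 + \dfrac{t}{T}\,(w^{*}-1), & c \text{ is the minority class},\\
1, & \text{otherwise},
\end{cases}
$$

where $p_{y_i}(x_i)$ is the predicted probability of the true class of sample $x_i$ and $w^{*}$ is the final minority-class weight reached at the end of training.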
Keywords
Class imbalance; Deep learning; Loss function; Cross-entropy
Subjects
New Technologies in Electronic and Cyber Defense
Article Title [English]
Improving the Performance of the Convolutional Neural Network Using an Incremental Weighted Loss Function to Deal with Class-Imbalanced Data
Authors [English]
Nasibeh Mahmoodi 1; Hossein Shirazi 2; Mohammad Fakhredanesh 3; Kourosh Dadashtabar Ahmadi 3
1 PhD Student, Faculty of Electrical and Computer Engineering, Malek Ashtar University of Technology
2 Associate Professor, Faculty of Electrical and Computer Engineering, Malek Ashtar University of Technology
3 Assistant Professor, Faculty of Electrical and Computer Engineering, Malek Ashtar University of Technology
Abstract [English]
Class-imbalanced datasets are common in many real-world domains, such as health, banking, and security. Machine learning researchers have recently focused on the classification of such datasets, in which the costs of different types of misclassification are unequal, the classes have different prior probabilities, or both. Class imbalance significantly degrades the performance of most standard classifier learning algorithms, which, despite recent advances in deep learning, remain biased toward the majority-class instances; moreover, there is very little empirical work on deep learning with class imbalance. To address this issue, we propose an incrementally weighted cross-entropy loss function that gradually increases the weight of the minority class as training progresses, until it reaches a specified value at the end of training. Experiments on three datasets, a synthetic dataset, a human activity recognition dataset, and CIFAR-10, demonstrate the convergence and performance of the proposed method. The proposed method is compared with decision-tree-based AdaBoost, a cross-entropy-based convolutional neural network, a weighted-cross-entropy-based CNN, the SMOTE method, and an ensemble-CNN method. With accuracies of 94.6%, 92.92%, and 69.23% on the three datasets (CIFAR-10 with a 5% imbalance rate), the proposed method outperformed the other methods, and its accuracy on the synthetic dataset was 17.77% higher than that of the traditional decision-tree-based AdaBoost method.
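The snippet below is a minimal PyTorch sketch of such an incrementally weighted cross-entropy loss, meant only to illustrate the idea described in the abstract rather than reproduce the authors' implementation; the class name `IncrementalWeightedCE`, the linear ramp schedule, and the renormalization of the weight vector are our assumptions.

```python
# Minimal sketch (not the authors' code) of an incrementally weighted
# cross-entropy loss: the minority-class weight ramps up linearly from 1
# to a target value over training, and the weight vector is renormalized
# so the majority classes' relative contribution shrinks accordingly.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IncrementalWeightedCE(nn.Module):  # name and schedule are illustrative assumptions
    def __init__(self, num_classes, minority_class, final_weight, total_epochs):
        super().__init__()
        self.num_classes = num_classes
        self.minority_class = minority_class  # index of the minority class
        self.final_weight = final_weight      # weight reached at the last epoch
        self.total_epochs = total_epochs

    def forward(self, logits, targets, epoch):
        # Linear ramp from 1.0 toward final_weight as training progresses.
        progress = min(epoch / max(self.total_epochs - 1, 1), 1.0)
        weights = torch.ones(self.num_classes, device=logits.device)
        weights[self.minority_class] = 1.0 + progress * (self.final_weight - 1.0)
        # Renormalize so the total weight mass stays constant, which lowers
        # the effective weight of the majority classes as training advances.
        weights = weights * self.num_classes / weights.sum()
        return F.cross_entropy(logits, targets, weight=weights)

# Example usage:
# criterion = IncrementalWeightedCE(num_classes=10, minority_class=3,
#                                   final_weight=5.0, total_epochs=50)
# loss = criterion(model(images), labels, epoch)
```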
Keywords [English]
Class-imbalanced dataset; Convolutional neural network; Loss function; Cross-entropy