Authors: Okur, Erdem; Türkan, Mehmet
Dates: 2023-10-27; 2023-10-27; 2024
ISSN: 0957-4174; eISSN: 1873-6793
DOI: https://doi.org/10.1016/j.eswa.2023.121531
Handle: https://hdl.handle.net/20.500.14365/4891

Title: Weighted Bag of Visual Words With Enhanced Deep Features for Melanoma Detection
Type: Article

Abstract: The human skin, the largest organ with multiple layered functionalities, houses melanocytes in the deeper strata of its epidermis. These cells can be adversely impacted by ultraviolet radiation, thereby instigating melanoma, the deadliest form of skin cancer. Failure to detect melanoma at an early stage can potentially lead to metastasis, forming complex tumors in other tissues. Despite substantial efforts, visual inspections can occasionally overlook melanoma cases due to inherent subjectivity. To surmount this challenge, an automated detection system is necessary. Recent attempts to establish such a system have predominantly employed push-through strategies involving deep (neural) networks and their ensembles, which however necessitate significant computational resources. This paper presents a novel approach, amalgamating a conventional machine learning technique, Bag of Visual Words, with a pretrained deep neural network for comprehensive deep feature extraction from enhanced input image patches. The proposed method, assessed on the ISIC Challenge 2017 dataset, surpassed all other entries on the challenge leaderboard, registering an accuracy of 96.2% in the task of lesion classification.

Language: en
Access: info:eu-repo/semantics/closedAccess
Keywords: Melanoma; Neural networks; Bag of Visual Words; Feature extraction; Skin cancer; ISIC challenge; Convolutional neural networks; Dermoscopy; Classification
Scopus ID: 2-s2.0-85171617135
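The abstract describes combining a Bag of Visual Words encoding with deep features extracted from image patches by a pretrained network. The following is a minimal sketch of the generic BoVW pipeline only, not the paper's implementation: random vectors stand in for deep patch descriptors, and KMeans clustering builds the visual vocabulary.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical stand-in: in the paper, patch descriptors come from a
# pretrained deep network; here we simulate them with random vectors.
rng = np.random.default_rng(0)
train_features = rng.normal(size=(200, 64))  # 200 training patches, 64-dim each

# Build the visual vocabulary by clustering training patch descriptors.
k = 16  # vocabulary size (illustrative choice, not from the paper)
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(train_features)

def bovw_histogram(features, vocabulary):
    """Encode one image's patch descriptors as a normalized word histogram."""
    words = vocabulary.predict(features)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / hist.sum()

# Encode a new image given its (simulated) patch descriptors.
image_features = rng.normal(size=(50, 64))
hist = bovw_histogram(image_features, kmeans)
```

The resulting fixed-length histogram can then be fed to any conventional classifier; the paper additionally weights the words, a step omitted from this sketch.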