Paper Type

Complete

Abstract

Lightweight CNN architectures, known for their efficiency and speed without compromising accuracy, play a crucial role in addressing the challenges posed by limited training data and high computational resource demands. These architectures pair naturally with transfer learning, a pivotal deep learning technique that allows pre-trained models to be adapted to new tasks. Our study evaluated a multi-stage transfer learning approach across various dataset sizes and truncated versions of the MobileNetV2 architecture for medical image classification. The results reveal that simpler models can perform competitively, although more complex models generally deliver higher accuracy. Furthermore, the effect of model complexity on target-task performance tends to diminish with smaller datasets.
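To make the setup concrete, below is a minimal sketch of truncating MobileNetV2 and fine-tuning it in two stages, assuming a TensorFlow/Keras environment. The cut layer (block_13_expand_relu), the classification head, the number of classes, and the learning rates are illustrative assumptions, not the configuration reported in the paper.

import tensorflow as tf

# Illustrative choices (not taken from the paper).
CUT_LAYER = "block_13_expand_relu"  # hypothetical truncation point
NUM_CLASSES = 2                     # placeholder medical classification task

# Load the ImageNet-pretrained backbone without its classifier head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)

# Truncate the network at an intermediate block.
truncated = tf.keras.Model(
    inputs=base.input, outputs=base.get_layer(CUT_LAYER).output
)

# Stage 1: freeze the truncated backbone and train only the new head.
truncated.trainable = False
model = tf.keras.Sequential([
    truncated,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Stage 2: unfreeze the backbone and fine-tune at a lower learning rate.
truncated.trainable = True
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-5),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=5)

Repeating this procedure at several cut depths and dataset fractions would yield the kind of complexity-versus-dataset-size comparison the abstract describes.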

Paper Number

1882

Author Connect URL

https://authorconnect.aisnet.org/conferences/AMCIS2024/papers/1882

Comments

SIGHEALTH

Top 25 Paper Badge
Aug 16th, 12:00 AM

IMPACT OF DATASET SIZE AND TRANSFER LEARNING ON TRUNCATED LIGHTWEIGHT ARCHITECTURE

