Paper Type

Complete

Description

This study explored different hyperparameters and their outcomes across multiple DNN architectures by applying transfer learning to the plant seedling dataset. Optimizers have different strengths and weaknesses, and their performance may depend on the specific characteristics of the model and the dataset being used. We observed that the choice of optimizer is just one of many hyperparameters that affect the performance of deep neural network (DNN) models; others, such as the dropout rate, number of epochs, and batch size, can also have a significant impact. MobileNetV2 demonstrated superior performance while maintaining a smaller model size, making it a highly valuable option for edge devices where size is a crucial factor.
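The tuning described above (searching over optimizer, dropout rate, epochs, and batch size) can be sketched as a simple grid search. The grid values and the `train_and_evaluate` stub below are hypothetical illustrations, not the paper's actual settings; in practice the stub would train a transfer-learning model such as MobileNetV2 on the plant seedling dataset and return its validation accuracy.

```python
from itertools import product

# Hypothetical hyperparameter grid; the paper's actual values are not given here.
grid = {
    "optimizer": ["adam", "sgd", "rmsprop"],
    "dropout_rate": [0.2, 0.5],
    "batch_size": [16, 32],
    "epochs": [10, 20],
}

def train_and_evaluate(config):
    """Placeholder for training a transfer-learning model (e.g. MobileNetV2)
    on the plant seedling dataset and returning validation accuracy."""
    # Stubbed score so the sketch runs without a GPU or dataset.
    return 0.5 + 0.1 * (config["optimizer"] == "adam") - 0.05 * config["dropout_rate"]

def grid_search(grid):
    """Try every combination in the grid and keep the best-scoring config."""
    best_config, best_score = None, float("-inf")
    for values in product(*grid.values()):
        config = dict(zip(grid.keys(), values))
        score = train_and_evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best_config, best_score = grid_search(grid)
print(best_config, best_score)
```

Exhaustive grid search is the simplest strategy; for larger grids, random search or Bayesian optimization over the same configuration dictionary is a common refinement.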

Paper Number

1319

Comments

SIG ODIS

Aug 10th, 12:00 AM

Deep Neural Network at Edge: An Exploration of Hyper Parameter Tuning on Plant Seedling Dataset

