
Auto-tuning of the hyperparameters of a deep neural network using the meta-heuristic Firefly Algorithm. The DNNs are implemented in TensorFlow.


Global-Search-Optimization

UNDERSTANDING THE CODE

The implementation can be divided into three modules:

  1. Firefly structure, definition, and algorithm implementation (a minimal sketch of the attraction-based update appears after this list).

  2. Designing a modular deep learning model that can be constructed dynamically by the firefly algorithm.

  3. Implementation of parallel processing techniques for performance optimization (also shown in the sketch below).
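
Taken together, modules 1 and 3 could interact as in the sketch below: a swarm of fireflies is evaluated in parallel, and dimmer fireflies move towards brighter ones. All names and parameter values here (`evaluate_fitness`, `move_towards`, `beta0`, `gamma`, `alpha`) are illustrative assumptions rather than identifiers from this repository, and a toy fitness function stands in for actually training a DNN.

```python
import math
import random
from multiprocessing import Pool

def evaluate_fitness(firefly):
    # In the real project this would build and train a DNN with the
    # firefly's hyperparameters and return its validation accuracy.
    # A toy surrogate (closeness to an arbitrary point) is used here
    # so the sketch runs on its own.
    target = [0.01, 3.0, 64.0]
    return -sum((a - b) ** 2 for a, b in zip(firefly, target))

def move_towards(dimmer, brighter, beta0=1.0, gamma=0.01, alpha=0.1):
    # Classic firefly update: attractiveness decays with squared
    # distance, plus a small random-walk term.
    r2 = sum((a - b) ** 2 for a, b in zip(dimmer, brighter))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(dimmer, brighter)]

def firefly_search(swarm, generations=20):
    with Pool() as pool:  # module 3: fitness evaluations run in parallel
        for _ in range(generations):
            brightness = pool.map(evaluate_fitness, swarm)
            for i, b_i in enumerate(brightness):
                for j, b_j in enumerate(brightness):
                    if b_j > b_i:  # j is brighter, so i moves towards it
                        swarm[i] = move_towards(swarm[i], swarm[j])
    return max(swarm, key=evaluate_fitness)

if __name__ == "__main__":
    swarm = [[random.uniform(0.0, 0.1), random.uniform(1, 5),
              random.uniform(8, 128)] for _ in range(10)]
    print(firefly_search(swarm))
```

This sketch uses fixed-length positions for simplicity; the repository's fireflies are variable-length, as described below.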

For deep model optimization, we choose the learning rate (denoted by the variable x), the number of hidden layers (the variable y), and the number of nodes in each layer (the variable z) as the hyperparameters to tune. This gives the fireflies a dynamic structure, since the number of z values depends on the number of hidden layers.
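
One plausible encoding of such a variable-length firefly is a flat list [x, y, z_1, ..., z_y]; the helper name and the sampling bounds below are assumptions for illustration, not values from this repository.

```python
import random

def random_firefly(max_layers=5, max_nodes=128):
    # x = learning rate, y = number of hidden layers,
    # z_i = nodes in hidden layer i. The list has length 2 + y,
    # so fireflies in the same swarm can differ in dimensionality.
    x = 10 ** random.uniform(-4, -1)   # learning rate on a log scale
    y = random.randint(1, max_layers)
    z = [random.randint(8, max_nodes) for _ in range(y)]
    return [x, y, *z]

swarm = [random_firefly() for _ in range(25)]
```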

All designed models use the gradient descent algorithm as the optimizer for the weights. Each hidden layer uses the sigmoid activation function, and the last layer applies softmax cross-entropy to classify the labels.
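
A minimal TensorFlow/Keras sketch of such a model is shown below; the input and output sizes are placeholder assumptions, and `build_model` is an illustrative name rather than a function from this repository. Softmax cross-entropy is expressed here as a logits output layer combined with `CategoricalCrossentropy(from_logits=True)`.

```python
import tensorflow as tf

def build_model(learning_rate, layer_sizes, n_inputs=784, n_classes=10):
    # One sigmoid Dense layer per entry of layer_sizes (the firefly's
    # z values); the last layer emits logits, and softmax cross-entropy
    # is applied through the loss function.
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(n_inputs,)))
    for nodes in layer_sizes:
        model.add(tf.keras.layers.Dense(nodes, activation="sigmoid"))
    model.add(tf.keras.layers.Dense(n_classes))
    model.compile(
        optimizer=tf.keras.optimizers.SGD(learning_rate=learning_rate),  # plain gradient descent
        loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )
    return model
```

Under the encoding sketched earlier, a firefly [x, y, z_1, ..., z_y] would then map to `build_model(x, firefly[2:])`.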
