A java library providing a configurable neural network. Supports supervised learning and genetic algorithm.


lpapailiou/neuralnetwork


[image: architecture]

neural network

This is a Maven library for neural networks in Java 8.
Different visualization options are provided via JavaFX.

Table of Contents

  1. About
  2. Scope
    2.1 Architecture
    2.2 Supported algorithms
    2.3 Initializers
    2.4 Rectifiers
    2.5 Cost functions
    2.6 Optimizers
    2.7 Parametrization
    2.8 Persistence
    2.9 UI
  3. Samples
    3.1 Overview
    3.2 Walkthrough
  4. Implementation
  5. Releases
  6. References

About

The motivation for this library originated in the study project snakeML, where the game Snake was to be solved by neural networks. That rudimentary approach was extracted into this library and improved over time, both to study neural networks and to play around with Java.
The goal of this library is to make neural networks easily accessible in terms of 'how does it work' and to provide an easy-to-use, plug-and-play tool for other projects.
While the first steps focused on functionality, later work focused on different approaches to visualization with JavaFX.

Scope

Architecture

  • fully connected
  • zero or more hidden layers supported
  • configurable node count per layer
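To illustrate what a fully connected architecture with configurable layer sizes means, here is a minimal, self-contained sketch of a forward pass with a sigmoid activation. This is illustrative only and does not reflect the library's internal implementation; the class and method names are invented for the example.

```java
// Minimal sketch of a fully connected forward pass. Each layer transition is
// one weight matrix; layer sizes (e.g. {2, 4, 1}) follow from the matrix shapes.
// Illustrative only; not this library's internals.
public class ForwardPassSketch {

    public static double[] forward(double[] input, double[][][] weights) {
        double[] activation = input;
        for (double[][] layer : weights) {              // one matrix per layer transition
            double[] next = new double[layer.length];
            for (int i = 0; i < layer.length; i++) {
                double sum = 0;
                for (int j = 0; j < activation.length; j++) {
                    sum += layer[i][j] * activation[j]; // weighted sum of previous layer
                }
                next[i] = 1.0 / (1.0 + Math.exp(-sum)); // sigmoid activation
            }
            activation = next;
        }
        return activation;
    }
}
```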

Supported algorithms

  • Supervised learning
  • Genetic algorithm

Initializers

  • Static
  • Random
  • Xavier
  • Kaiming
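As a rough orientation for the last two entries, the sketch below shows the standard formulas behind Xavier and Kaiming initialization (variance scaled by fan-in); these are the textbook definitions, and the library's exact variants may differ.

```java
import java.util.Random;

// Standard formulas for two of the initializers listed above: Xavier scales
// the variance by 1/fanIn (suited to sigmoid/tanh), Kaiming by 2/fanIn
// (suited to ReLU). Textbook definitions, not this library's source.
public class InitializerSketch {

    public static double[][] xavier(int fanOut, int fanIn, Random rnd) {
        return fill(fanOut, fanIn, Math.sqrt(1.0 / fanIn), rnd);
    }

    public static double[][] kaiming(int fanOut, int fanIn, Random rnd) {
        return fill(fanOut, fanIn, Math.sqrt(2.0 / fanIn), rnd);
    }

    private static double[][] fill(int rows, int cols, double std, Random rnd) {
        double[][] w = new double[rows][cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                w[i][j] = rnd.nextGaussian() * std; // zero-mean Gaussian draw
        return w;
    }
}
```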

Rectifiers

The following rectifiers are implemented:

  • Identity
  • RELU
  • Leaky RELU
  • Sigmoid
  • Sigmoid (accurate)
  • SILU
  • SILU (accurate)
  • TANH
  • ELU
  • GELU
  • Softplus
  • Softmax
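For reference, here are the standard definitions of a few of the rectifiers listed above. These are the common textbook formulas; the library's exact variants (e.g. the "accurate" sigmoid and SILU entries) may differ in implementation detail.

```java
// Standard definitions of a few rectifiers from the list above.
public class RectifierSketch {

    public static double sigmoid(double x)   { return 1.0 / (1.0 + Math.exp(-x)); }

    public static double relu(double x)      { return Math.max(0, x); }

    public static double leakyRelu(double x) { return x >= 0 ? x : 0.01 * x; } // small negative slope

    public static double silu(double x)      { return x * sigmoid(x); }        // sigmoid-weighted linear unit

    public static double softplus(double x)  { return Math.log1p(Math.exp(x)); }
}
```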

Cost functions

  • MSE native
  • MSE
  • Cross entropy
  • Exponential
  • Hellinger distance
  • KLD
  • GKLD
  • ISD
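To make two of these concrete, the sketch below implements the standard definitions of MSE and cross entropy; the formulas are assumed from the literature, not taken from this library's code, and the epsilon guard is an assumption of the example.

```java
// Standard definitions of two cost functions from the list above.
public class CostSketch {

    public static double mse(double[] target, double[] actual) {
        double sum = 0;
        for (int i = 0; i < target.length; i++) {
            double d = target[i] - actual[i];
            sum += d * d;                             // squared error per component
        }
        return sum / target.length;                   // mean over outputs
    }

    public static double crossEntropy(double[] target, double[] actual) {
        double sum = 0;
        for (int i = 0; i < target.length; i++) {
            sum += -target[i] * Math.log(actual[i] + 1e-12); // epsilon guards log(0)
        }
        return sum;
    }
}
```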

Optimizers

  • none (static)
  • stochastic gradient descent (applicable to learning rate and mutation rate)
  • dropout
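The stochastic-gradient-descent entry notes it is applicable to the learning rate and the mutation rate; the sketch below shows one common way such a rate can decay over iterations. The exponential schedule and parameter names are assumptions for illustration, not the library's API.

```java
// Sketch of a rate-decay schedule, as could be applied to a learning rate
// or a mutation rate: the rate shrinks exponentially with each iteration.
public class DecaySketch {

    public static double decayed(double initialRate, double decay, int iteration) {
        return initialRate * Math.pow(1.0 - decay, iteration); // exponential decay
    }
}
```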

Parametrization

The hyperparameters of the neural network can be set as follows:

  • programmatically
  • via a neuralnetwork.properties file, in case the same default values are reused constantly
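A neuralnetwork.properties file might look roughly like the fragment below. The keys shown here are hypothetical placeholders; the actual key names are defined by the library and may differ.

```properties
# Hypothetical sketch of neuralnetwork.properties; key names are
# placeholders, not the library's actual keys.
learning_rate=0.8
rectifier=sigmoid
initializer=xavier
```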

Persistence

  • neural network instances are fully serializable
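Since instances are fully serializable, a trained network can be saved and restored with standard Java serialization. The sketch below demonstrates the round trip with plain java.io streams; it is shown with a generic Serializable object, on the assumption that a NeuralNetwork instance would be handled the same way.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Save/load sketch using standard Java serialization; a fully serializable
// network instance would round-trip the same way.
public class PersistenceSketch {

    public static byte[] save(Serializable obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);                     // serialize to a byte stream
        }
        return bytes.toByteArray();
    }

    public static Object load(byte[] data) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return in.readObject();                   // restore the instance
        }
    }
}
```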

UI

With the additional ui package, you can visualize the neural network and additional metrics interactively, based on the JavaFX framework.

Samples

Overview

To get an idea of the look and feel, see the following samples, which were created with this library.

Sample code for a minimal prediction task:

```java
double[] in = {1, 0.5};                                             // input values
NeuralNetwork neuralNetwork = new NeuralNetwork.Builder(2, 4, 1)
    .setDefaultRectifier(Rectifier.SIGMOID)
    .setLearningRate(0.8)
    .build();                                                       // initialization
List<Double> out = neuralNetwork.predict(in);                       // prediction
```

Live visualization of a predicting neural network:

[image: architecture]

Line charts of available rectifiers:

[image: rectifiers]

Visualization of the weights of a layer, per node and overall, trained on MNIST:

[image: layer weights]

Confusion matrix visualization:

[image: confusion matrix]

Binary decision boundaries in 2D and 3D, manually refreshed while training on the XOR dataset:

[image: binary decision boundaries]

Multiclass decision boundaries in 3D, animated:

[image: multiclass decision boundaries]

Fully integrated sample UI for rudimentary analysis of the TSP (traveling salesman problem):

[image: tsp sample ui]

Walkthrough

Detailed examples are available here:

| Topic               | Description                                                                       |
| ------------------- | --------------------------------------------------------------------------------- |
| basic usage         | constructor, methods, basic features                                              |
| supervised learning | implementation example                                                            |
| genetic algorithm   | implementation example                                                            |
| visualization       | charts, decision boundaries, confusion matrix, layer weights etc. in 2D and 3D    |

Implementation

This library can be integrated either as a JAR file or as a Maven dependency.
Detailed instructions are documented here.

Releases

As this project started as a fun project, and the concept of 'free time' is more a fairy tale than reality, there is no proper version control (yet).
In general, the neural network algorithm is quite stable; no big changes are to be expected soon.
Before new features are introduced, a stable, consistent release will be made.

| Release  | Description                                                            |
| -------- | ---------------------------------------------------------------------- |
| upcoming | stable, consistent release, focusing on consistency                    |
| 3.1      | mostly minor fixes and features added, currently treated as snapshot   |
| 3.0      | introduction of charts and other visualizations, a lot of refactoring  |
| <= 2.5   | multiple releases focusing on the neural network algorithm             |

References

Code:

Literature: