HNN

Description

HNN (short for Haskell Neural Network library) is an attempt at providing a simple yet powerful and efficient library for working with feed-forward neural networks in Haskell.

Why another neural network library?

I tried HFANN and a few other packages I found before deciding to write my own library, but I wasn't satisfied with them. They are more comprehensive, but they couldn't be used the way I intended for a Haskell neural network library. HNN is much simpler and less comprehensive, but it aims to make creating, training and using neural networks in Haskell easy, without sacrificing performance. Note: HNN is written entirely in Haskell, unlike HFANN, which is a binding to a C library.

Get the code

From Hackage

hnn-0.1 will be on Hackage in the coming days, in the AI category. From then on, a simple:

  • cabal install hnn

should install HNN for you. (You may need to run cabal update first so that cabal becomes aware of the new hnn package, as well as of other new packages and new versions.)


From the git repository

HNN is hosted on GitHub: [1]. The instructions to get and build it are as follows.
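A typical sequence, assuming the standard Cabal workflow (substitute the actual repository URL given at [1]), would be:

  • git clone <repository URL from [1]> HNN
  • cd HNN
  • cabal install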

After these commands, provided you also have the 'base' (>= 3 && <= 5) and 'uvector' packages installed, HNN will be installed just like any other library. To generate the documentation, run:

  • cabal haddock

The documentation should then be in HNN/dist/doc/.

You can look at the xor-3inputs.hs file for an example of how to use the library (see the Example section below).

Documentation

An online version of the documentation is available here: [2]; the same documentation should soon appear on the Hackage page for hnn.

Example

Here is a simple example of using the HNN library.

The xor-3inputs.hs file:

module Main where
  
import AI.HNN.Net
import AI.HNN.Layer
import AI.HNN.Neuron
import Data.Array.Vector
  
alpha = 0.8 :: Double -- learning rate
epsilon = 0.001 :: Double -- desired upper bound on the quadratic error

layer1, layer2 :: [Neuron]

layer1 = createSigmoidLayer 4 0.5 [0.5, 0.5, 0.5] -- the hidden layer: 4 sigmoid neurons, threshold 0.5, one initial weight per input

layer2 = createSigmoidLayer 1 0.5 [0.5, 0.4, 0.6, 0.3] -- the output layer: 1 sigmoid neuron, threshold 0.5, one initial weight per hidden neuron
 
net = [layer1, layer2] -- the neural network
  
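-- the training set: each pair is (input, expected output);
-- the target is the XOR of the last two inputs (the first input is always 1)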
finalnet = train alpha epsilon net [([1, 1, 1],[0]), ([1, 0, 1],[1]), ([1, 1, 0],[1]), ([1, 0, 0],[0])] -- the trained neural network

good111 = computeNet finalnet [1, 1, 1]
good101 = computeNet finalnet [1, 0, 1]
good110 = computeNet finalnet [1, 1, 0]
good100 = computeNet finalnet [1, 0, 0]

main = do
     putStrLn $ "Final neural network : \n" ++ show finalnet
     putStrLn " ---- "
     putStrLn $ "Output for [1, 1, 1] (~ 0): " ++ show good111
     putStrLn $ "Output for [1, 0, 1] (~ 1): " ++ show good101
     putStrLn $ "Output for [1, 1, 0] (~ 1): " ++ show good110
     putStrLn $ "Output for [1, 0, 0] (~ 0): " ++ show good100

Compile it with ghc -O2 --make xor-3inputs.hs -o xor-3inputs and launch it. You should get something close to the following:


$ ./xor-3inputs 
Final neural network : 
[[Threshold : 0.5
Weights : toU [1.30887603787326,1.7689534867644316,2.2908214981696453],Threshold : 0.5
Weights : toU [-2.4792430791673947,4.6447786039112655,-4.932860802255383],Threshold : 0.5
Weights : toU [2.613377735822592,6.793687725768354,-5.324081206358496],Threshold : 0.5
Weights : toU [-2.5134194114492585,4.730152273922408,-5.021321916827272]],[Threshold : 0.5
Weights : toU [4.525235803191061,4.994126671590998,-8.2102354168462,5.147655509585701]]]
 ---- 
Output for [1, 1, 1] (~ 0): [2.5784449476436315e-2]
Output for [1, 0, 1] (~ 1): [0.9711209812630944]
Output for [1, 1, 0] (~ 1): [0.9830499812666017]
Output for [1, 0, 0] (~ 0): [1.4605247804272069e-2]

Feedback, participation and all

If you have anything to say about the HNN library, please send an email to alpmestan <a t> gmail <d o t> com. I'd be pleased to hear from any HNN user, so don't hesitate! Thank you.