[![Build Status](https://travis-ci.org/codeplea/genann.svg?branch=master)](https://travis-ci.org/codeplea/genann)
# Genann
Genann is a very minimal library for training and using feedforward artificial neural
networks (ANN) in C. Its primary focus is on being simple, fast, and hackable. It achieves
this by providing only the necessary functions and little extra.
## Features
- **ANSI C with no dependencies**.
- Contained in a single source file and a single header file.
- Simple.
- Fast and thread-safe.
- Easily extendible.
- Implements backpropagation training.
- Compatible with alternative training methods (classic optimization, genetic algorithms, etc.).
- Includes examples and test suite.
- Released under the zlib license - free for nearly any use.
## Example Code
Four example programs are included.
- `example1.c` - Trains an ANN on the XOR function using backpropagation.
- `example2.c` - Trains an ANN on the XOR function using random search.
- `example3.c` - Loads and runs an ANN from a file.
- `example4.c` - Trains an ANN on the [IRIS data-set](https://archive.ics.uci.edu/ml/datasets/Iris) using backpropagation.
## Quick Example
Here we create an ANN, train it on a set of labeled data using backpropagation,
ask it to predict on a test data point, and then free it:
```C
#include "genann.h"

/* Not shown, loading your training and test data. */
double **training_data_input, **training_data_output, **test_data_input;

/* New network with 5 inputs,
 * 2 hidden layers of 10 neurons each,
 * and 1 output. */
genann *ann = genann_init(5, 2, 10, 1);

/* Learn on the training set. */
int i, j;
for (i = 0; i < 300; ++i) {
    for (j = 0; j < 100; ++j)
        genann_train(ann, training_data_input[j], training_data_output[j], 0.1);
}

/* Run the network and see what it predicts. */
printf("Output for the first test data point is: %f\n", *genann_run(ann, test_data_input[0]));

genann_free(ann);
```
Note that this example is intended to show API usage; it does not demonstrate
good machine learning technique. In a real application you would likely want to
present the training data in a random order. You would also want to monitor the
learning to prevent over-fitting.
## Usage
### Creating and Freeing ANNs
```C
genann *genann_init(int inputs, int hidden_layers, int hidden, int outputs);
genann *genann_copy(genann const *ann);
void genann_free(genann *ann);
```
Creating a new ANN is done with the `genann_init()` function. Its arguments
are the number of inputs, the number of hidden layers, the number of neurons in
each hidden layer, and the number of outputs. It returns a `genann` struct pointer.

Calling `genann_copy()` will create a deep copy of an existing `genann` struct.

Call `genann_free()` when you're finished with an ANN returned by `genann_init()`.
### Training ANNs
```C
void genann_train(genann const *ann, double const *inputs,
        double const *desired_outputs, double learning_rate);
```
`genann_train()` will perform one update using standard backpropagation. It
should be called by passing in an array of inputs, an array of expected outputs,
and a learning rate. See *example1.c* for an example of learning with
backpropagation.
A primary design goal of Genann was to store all the network weights in one
contiguous block of memory. This makes it easy and efficient to train the
network weights using direct-search numeric optimization algorithms,
such as [Hill Climbing](https://en.wikipedia.org/wiki/Hill_climbing),
[the Genetic Algorithm](https://en.wikipedia.org/wiki/Genetic_algorithm), [Simulated
Annealing](https://en.wikipedia.org/wiki/Simulated_annealing), etc.
These methods can be used by searching on the ANN's weights directly.
Every `genann` struct contains the members `int total_weights;` and
`double *weight;`. `weight` points to an array of size `total_weights`
which contains all weights used by the ANN. See *example2.c* for
an example of training using random hill climbing search.
### Saving and Loading ANNs
```C
genann *genann_read(FILE *in);
void genann_write(genann const *ann, FILE *out);
```
Genann provides the `genann_read()` and `genann_write()` functions for loading or saving an ANN in a text-based format.
### Evaluating
```C
double const *genann_run(genann const *ann, double const *inputs);
```
Call `genann_run()` on a trained ANN to run a feed-forward pass on a given set of inputs.
`genann_run()` will return a pointer to the array of predicted outputs (of length `ann->outputs`).
## Hints
- All functions start with `genann_`.
- The code is simple. Dig in and change things.
## Extra Resources
The [comp.ai.neural-nets
FAQ](http://www.faqs.org/faqs/ai-faq/neural-nets/part1/) is an excellent
resource for an introduction to artificial neural networks.
If you're looking for a heavier, more opinionated neural network library in C,
I highly recommend the [FANN library](http://leenissen.dk/fann/wp/). Another
good library is Peter van Rossum's [Lightweight Neural
Network](http://lwneuralnet.sourceforge.net/), which despite its name, is
heavier and has more features than Genann.