
Using JOONE for Artificial Intelligence Programming


November 21, 2002

Introduction

Few programmers have not been intrigued by Artificial Intelligence programming at one point or another. Many programmers who become interested in AI are quickly put off by the complexity of the algorithms involved. In this article, we will examine an open source project for Java that can simplify much of this complexity.

The Java Object Oriented Neural Network (JOONE) is an open source project that offers a highly adaptable neural network for Java programmers. The JOONE source code is covered by the GNU Lesser General Public License (LGPL). In a nutshell, this means that the source code is freely available and you pay no royalties to use JOONE. JOONE can be downloaded from http://joone.sourceforge.net/.

JOONE allows you to create neural networks easily from a Java program. JOONE supports many features, such as multithreading and distributed processing. This means that JOONE can take advantage of multiprocessor computers and multiple computers to distribute the processing load.

Neural Networks

JOONE implements an artificial neural network in Java. An artificial neural network seeks to emulate the function of the biological neural networks that make up the brains of nearly all higher life forms on Earth. Neural networks are made up of neurons. A diagram of an actual neuron is shown in Figure 1.

Figure 1: A biological neuron

As you can see from Figure 1, the neuron is made up of a core cell body and several long connectors (the dendrites and the axon). The junctions through which neurons pass signals to one another are called synapses; in artificial neural networks, the term synapse is used for the whole connection between two neurons. Neural networks, both biological and artificial, work by transferring signals from neuron to neuron across these synapses.

Using JOONE

This article presents a simple example of how to use JOONE. The topic of neural networks is very broad and covers many different applications, so we will focus on using JOONE to solve a very simple pattern recognition problem. Pattern recognition is a very common use for neural networks.

In pattern recognition, the neural network is presented with a pattern to see whether it can recognize that pattern. The neural network should be able to recognize the pattern even when it is distorted in some way. This is similar to a human's ability to recognize something such as a traffic signal. The human should be able to recognize a traffic signal in the rain, in daylight, or at night. Even though each of these images looks considerably different, the human mind is able to determine that they represent the same object.

When programming JOONE, you generally work with two types of objects. Neuron layer objects represent a layer of one or more neurons that share similar characteristics. A neural network usually has two or more such layers. These layers are connected together by synapses. The synapses carry the pattern, which is to be recognized, from layer to layer.

Synapses do not just transmit the pattern from one neuron layer to the next. Synapses will develop biases towards elements of the pattern. These biases will cause certain elements of the pattern to be transmitted less effectively to the next layer than they would otherwise be. These biases, which are usually called weights, form the memory of the neural network. By adjusting the weights, which are stored in synapses, the behavior of the neural network is altered.
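To make this concrete, the following minimal Java sketch shows how a single neuron combines its inputs using the weights stored in its incoming synapses. This is illustrative arithmetic only, not JOONE's internal implementation.

```java
public class WeightedSum {

    // Compute the raw activation of one neuron: the sum of each
    // input value multiplied by the weight of its synapse. The
    // weights are the "memory" of the network described above.
    public static double activate(double[] inputs, double[] weights) {
        double sum = 0.0;
        for (int i = 0; i < inputs.length; i++) {
            sum += inputs[i] * weights[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] pattern = {1.0, 0.0};
        double[] weights = {0.5, -0.3};
        // Only the first input contributes, so the result is 0.5
        System.out.println(activate(pattern, weights));
    }
}
```

Adjusting the values in the weights array changes what the neuron responds to, which is exactly what training does.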

Synapses also play another role in JOONE. In JOONE, it is useful to think of synapses as data conduits. Just as synapses carry patterns from one neuron layer to another, specialized versions of the synapse are used to carry patterns both into and out of the neural network. You will now be shown how a simple neural network, with a single hidden layer, can be constructed to recognize a pattern.

Training the Neural Network

For the purposes of the article, we will teach JOONE to recognize a very simple pattern. For this pattern, we will examine a binary boolean operation, such as XOR. The XOR operation's truth table is summarized below.

  X   Y   X XOR Y
  0   0      0
  0   1      1
  1   0      1
  1   1      0

As you can see from the preceding table, the XOR operator will only be true, indicated by a value of one, when X and Y hold different values. In all other cases, the XOR operator evaluates to false, indicated by a zero. By default, JOONE takes its input from text files stored on your system. These text files are read by using a special synapse called the FileInputSynapse. To train for the XOR problem, you must construct an input file that contains the data shown above. This file is shown in Listing 1.

Listing 1: Input file for the XOR problem

0.0;0.0;0.0
0.0;1.0;1.0
1.0;0.0;1.0
1.0;1.0;0.0
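If you would rather generate this file than type it by hand, the following short Java sketch produces the same semicolon-delimited rows and writes them to train.txt (the default file name used by the example program later in this article).

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class MakeTrainingFile {

    // Build the four XOR rows in JOONE's semicolon-delimited
    // format: input1;input2;expected-output
    public static String xorData() {
        StringBuilder sb = new StringBuilder();
        for (int x = 0; x <= 1; x++) {
            for (int y = 0; y <= 1; y++) {
                int out = (x != y) ? 1 : 0;  // XOR truth table
                sb.append(x).append(".0;")
                  .append(y).append(".0;")
                  .append(out).append(".0\n");
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        try (PrintWriter pw = new PrintWriter(new FileWriter("train.txt"))) {
            pw.print(xorData());
        }
    }
}
```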

We will now examine a simple program that teaches JOONE to recognize the XOR operation and produce the correct result. Training involves presenting the XOR problem to the neural network and observing the result. If the result is not what was expected, the training algorithm adjusts the weights stored in the synapses. The difference between the actual output of the neural network and the anticipated output is called the error. Training continues until the error falls below an acceptable level, generally expressed as a percentage, such as 10%. The following sections walk through the code used to train the neural network.
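The error can be measured in several ways; a common choice is the root mean square of the differences between the actual and expected outputs. The sketch below is illustrative and is not necessarily how JOONE computes its global error.

```java
public class TrainingError {

    // Root-mean-square error between what the network produced
    // and what the training data says it should have produced.
    public static double rmse(double[] actual, double[] expected) {
        double sum = 0.0;
        for (int i = 0; i < actual.length; i++) {
            double diff = actual[i] - expected[i];
            sum += diff * diff;
        }
        return Math.sqrt(sum / actual.length);
    }

    public static void main(String[] args) {
        // Outputs resembling those of a well-trained XOR network
        double[] actual   = {0.0125, 0.9855, 0.9853, 0.0178};
        double[] expected = {0.0,    1.0,    1.0,    0.0};
        System.out.println(rmse(actual, expected)); // well under 0.10
    }
}
```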

The training process begins by setting up the neural network. The input, hidden, and output layers must all be created.

    // First, creates the three Layers
    input = new SigmoidLayer();
    hidden = new SigmoidLayer();
    output = new SigmoidLayer();

As you can see, each of the layers is created using the JOONE object SigmoidLayer. Sigmoid layers produce an output based on the sigmoid (logistic) function, which is built from the natural exponential function. JOONE contains additional layer types, other than the sigmoid layer, that you may choose to use.
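For reference, the sigmoid function itself is easy to compute: it squashes any real input into the range (0, 1). The following sketch is purely illustrative; it is not JOONE's SigmoidLayer code.

```java
public class Sigmoid {

    // The logistic sigmoid: 1 / (1 + e^-x). Large positive inputs
    // map near 1, large negative inputs map near 0.
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0));   // exactly 0.5
        System.out.println(sigmoid(5.0));   // close to 1
        System.out.println(sigmoid(-5.0));  // close to 0
    }
}
```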

Next, each of these layers is given a name. These names will be helpful to later identify the layer during debugging.

    input.setLayerName("input");
    hidden.setLayerName("hidden");
    output.setLayerName("output");

Each layer must now be defined. We will specify the number of "rows" in each of the layers. This number of rows corresponds to the number of neurons in the layer.

    input.setRows(2);
    hidden.setRows(3);
    output.setRows(1);

As you can see from the preceding code, the input layer has two neurons, the hidden layer has three hidden neurons, and the output layer contains one neuron. It makes sense for the neural network to contain two input neurons and one output neuron because the XOR operator accepts two parameters and results in one value.

To make use of the neuron layers, we must also construct synapses. In this example, we will use two synapses. These synapses are created with the following lines of code.

    // input -> hidden conn.
    FullSynapse synapse_IH = new FullSynapse();
    // hidden -> output conn.
    FullSynapse synapse_HO = new FullSynapse();

Just as was the case with the neuron layers, synapses can also be given names to assist in debugging. The following lines of code name the newly created synapses.

    synapse_IH.setName("IH");
    synapse_HO.setName("HO");

Finally, we must connect the synapses to the appropriate neuron layers. The following lines of code do this.

    // Connect the input layer with the hidden layer
    input.addOutputSynapse(synapse_IH);
    hidden.addInputSynapse(synapse_IH);

    // Connect the hidden layer with the output layer
    hidden.addOutputSynapse(synapse_HO);
    output.addInputSynapse(synapse_HO);

Now that the neural network has been created, we must create a Monitor object that will regulate the neural network. The following lines of code create the Monitor object.

    // Create the Monitor object and set the learning parameters
    monitor = new Monitor();
    monitor.setLearningRate(0.8);
    monitor.setMomentum(0.3);

The learning rate and momentum are parameters that are used to specify how the training will occur. JOONE makes use of the backpropagation learning method. For more information on the learning rate or the momentum, you should refer to the backpropagation algorithm.
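As a rough illustration of what these two parameters do, the standard backpropagation weight step scales the current error gradient by the learning rate and adds a fraction (the momentum) of the previous step, which smooths the search. This sketch shows generic backpropagation arithmetic, not JOONE's internal code.

```java
public class MomentumUpdate {

    // One backpropagation weight step: the learning rate scales
    // the current gradient, and the momentum term carries over a
    // fraction of the previous step.
    public static double delta(double learningRate, double gradient,
                               double momentum, double previousDelta) {
        return learningRate * gradient + momentum * previousDelta;
    }

    public static void main(String[] args) {
        // Same values the example passes to the Monitor
        double d1 = delta(0.8, 0.1, 0.3, 0.0); // first step
        double d2 = delta(0.8, 0.1, 0.3, d1);  // momentum enlarges it
        System.out.println(d1 + " " + d2);
    }
}
```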

This monitor object should be assigned to each of the neuron layers. The following lines of code do this.

    input.setMonitor(monitor);
    hidden.setMonitor(monitor);
    output.setMonitor(monitor);

Like many standard Java objects, the JOONE monitor allows listeners to be added to it. As training progresses, JOONE will notify the listeners of the training's progress. For this simple example, we use:

    monitor.addNeuralNetListener(this);

We must now set up the input synapse. As previously mentioned, we will use a FileInputSynapse to read from a disk file. Disk files are not the only sort of input that JOONE can accept; JOONE is very flexible with regard to input sources. To allow JOONE to accept other input types, you simply create a new specialized synapse class to accept the input. For this example, we will use the FileInputSynapse, which is first instantiated.

    inputStream = new FileInputSynapse();

Next, the FileInputSynapse must be told which columns to use. The file shown in Listing 1 uses the first two columns as the inputs. The following lines of code set up the first two columns as the input to the neural network.

    // The first two columns contain the input values
    inputStream.setFirstCol(1);
    inputStream.setLastCol(2);
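Note that the columns are 1-indexed. The following sketch shows, purely for illustration, how a 1-indexed first/last column range maps onto one semicolon-delimited row from Listing 1; it is not how FileInputSynapse is actually implemented.

```java
public class ColumnSelect {

    // Select the 1-indexed columns [first, last] from a
    // semicolon-delimited row, mirroring setFirstCol/setLastCol.
    public static double[] select(String row, int first, int last) {
        String[] cols = row.split(";");
        double[] out = new double[last - first + 1];
        for (int i = first; i <= last; i++) {
            out[i - first] = Double.parseDouble(cols[i - 1]);
        }
        return out;
    }

    public static void main(String[] args) {
        // Columns 1-2 of the second training row are the inputs
        double[] in = select("0.0;1.0;1.0", 1, 2);
        System.out.println(in[0] + " " + in[1]);
    }
}
```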

Next, we must provide the name to the input file. This name will come directly from the user interface. An edit control was provided to collect the name of the input file. The following lines of code set the filename for the FileInputSynapse.

    // This is the file that contains the input data
    inputStream.setFileName(inputFile.getText());

As previously mentioned, a synapse is just a conduit for data to travel between neuron layers. The FileInputSynapse is the conduit through which data enters the neural network. To facilitate this, we must add the FileInputSynapse to the input layer of the neural network. This is done by the following line.

    input.addInputSynapse(inputStream);

Now that the neural network is set up, we must create a trainer. The trainer trains the neural network, while the monitor runs it through a set number of training iterations. For each training iteration, data is presented to the neural network and the result is observed. The difference between this result and the expected output is the error, and the neural network's weights, which are stored in the synapses that connect the neuron layers, are adjusted based on this error. As training progresses, the error level drops. The following lines of code set up the trainer and attach it to the monitor.

    trainer = new TeachingSynapse();
    trainer.setMonitor(monitor);

You will recall that the input file provided in Listing 1 contains three columns. So far, we have only used the first two columns, which specify the input to the neural network. The third column contains the expected output when the neural network is presented with the numbers in the first two columns. We must give the trainer access to this column so that the error can be determined. The error is the difference between the actual output of the neural network and this expected output. The following lines of code create another FileInputSynapse and prepare it to read from the same input file as before.

    // Setting of the file containing the desired responses,
    // provided by a FileInputSynapse
    samples = new FileInputSynapse();
    samples.setFileName(inputFile.getText());

This time, we would like to point the FileInputSynapse at the third column. The following lines of code do this and then set the trainer to use this FileInputSynapse.

    // The output values are on the third column of the file
    samples.setFirstCol(3);
    samples.setLastCol(3);
    trainer.setDesired(samples);

Finally, the trainer is connected to the output layer of the neural network. This will cause the trainer to receive the output of the neural network.

    // Connects the Teacher to the last layer of the net
    output.addOutputSynapse(trainer);

We are now ready to start the background threads for all of the layers, as well as the trainer.

    input.start();
    hidden.start();
    output.start();
    trainer.start();

Finally, we set some parameters for the training. We specify that there are four rows in the input file, that we would like to train for 20,000 cycles, and that we are learning. If you set the learning parameter to false, the neural network would simply process the input and not learn. We will cover input processing in the next section.

    monitor.setPatterns(4);
    monitor.setTotCicles(20000);
    monitor.setLearning(true);

We are now ready to begin the training process. Calling the Go method of the monitor will start the training process in the background.

    monitor.Go();

The neural network will now be trained for 20,000 cycles. When training is finished, the error should be reasonably low; an error level below 10% is acceptable.

Running the Neural Network

Now that the neural network has been trained, we can test it by presenting the input patterns to the neural network and observing the results. The method used to run the neural network must first prepare it to process data, because it is currently in training mode. To begin, we remove the trainer from the output layer and replace it with a FileOutputSynapse so that we can record the output from the neural network. The following lines of code do this.

    output.removeOutputSynapse(trainer);
    FileOutputSynapse results = new FileOutputSynapse();
    results.setFileName(resultFile.getText());

Now we must reset the input stream. We will use the same file input stream that we used during training. This will feed the same inputs that were used during training to the neural network.

    inputStream.resetInput();
    samples.resetInput();
    results.setMonitor(monitor);
    output.addOutputSynapse(results);

Next, we must restart all of the threads that correspond to the neural network.

    input.start();
    hidden.start();
    output.start();
    trainer.start();

Now that the threads have been restarted, we must set some basic configuration information for the recognition. The following lines of code do this.

    monitor.setPatterns(4);
    monitor.setTotCicles(1);
    monitor.setLearning(false);

First, the number of input patterns is set to four, because we want the neural network to process each of the four input patterns originally used to train it. Next, the number of cycles is set to one, since each pattern needs to be processed only once. Finally, learning is set to false, so the weights will not be changed. With this completed, we can call the "Go" method of the monitor.

    monitor.Go(); 

When the run completes, you will see that the output file produced will be similar to Listing 2.

Listing 2: The output from the neural network

0.012549763955262739
0.9854631848890223
0.9853159647305264
0.01783622084836082

You can see that the first line of the listing is a number reasonably close to zero. This is good, because the first line of the input training file, as seen in Listing 1, was supposed to result in zero. Similarly, the second line is reasonably close to one, which matches the expected output in the second line of the training file.
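To turn these raw outputs back into boolean values, you can simply threshold them: anything above 0.5 counts as one. This post-processing step is our own illustration and is not part of JOONE.

```java
public class RoundOutput {

    // Interpret a sigmoid output as a boolean: values above 0.5
    // are treated as true (1), values at or below 0.5 as false (0).
    public static int threshold(double output) {
        return (output > 0.5) ? 1 : 0;
    }

    public static void main(String[] args) {
        // Values resembling Listing 2
        double[] results = {0.0125, 0.9855, 0.9853, 0.0178};
        for (double r : results) {
            System.out.println(r + " -> " + threshold(r));
        }
    }
}
```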

Conclusion

As you can see, the JOONE engine encapsulates much of the complexities of neural network programming. The example presented here shows the basic process by which a neural network can be used. Though real-world implementations of neural networks will be much more complex, the basic process remains the same. Data is presented to the neural network for training, and then new patterns are presented for recognition. The example program provides a good starting point for exploration with JOONE. The complete example source file is shown in Listing 3.

Listing 3: The complete example

import javax.swing.*;
import java.awt.*;
import java.awt.event.*;

import org.joone.engine.*;
import org.joone.engine.learning.*;
import org.joone.io.*;

/**
 * Example: The XOR Problem with JOONE
 *
 * @author Jeff Heaton
 * @version 1.0
 */
public class XorExample extends JFrame implements
ActionListener,NeuralNetListener {

FullSynapse t1,t2;

  /**
   * The train button.
   */
  JButton btnTrain;

  /**
   * The run button.
   */
  JButton btnRun;

  /**
   * The quit button.
   */
  JButton btnQuit;

  /**
   * The name of the input file.
   */
  JTextField inputFile;

  /**
   * The name of the result file.
   */
  JTextField resultFile;

  /**
   * The status line.
   */
  JLabel status;

  /**
   * Constructor. Set up the components.
   */
  public XorExample()
  {
    setTitle("XOR Solution");

    Container content = getContentPane();

    GridBagLayout gridbag = new GridBagLayout();
    GridBagConstraints c = new GridBagConstraints();
    content.setLayout(gridbag);

    c.fill = GridBagConstraints.NONE;
    c.weightx = 1.0;

    // Training input label
    c.gridwidth = GridBagConstraints.REMAINDER; //end row
    c.anchor = GridBagConstraints.NORTHWEST;
    content.add(
               new JLabel("Enter the name of the training input file:"),c);

    // Training input filename
    c.gridwidth = GridBagConstraints.REMAINDER; //end row
    c.anchor = GridBagConstraints.NORTHWEST;
    content.add(
               inputFile = new JTextField(40),c);
    inputFile.setText("./train.txt");

    // Result file label
    c.gridwidth = GridBagConstraints.REMAINDER; //end row
    c.anchor = GridBagConstraints.NORTHWEST;
    content.add(
               new JLabel("Enter the name of the result file:"),c);

    // Result filename
    c.gridwidth = GridBagConstraints.REMAINDER; //end row
    c.anchor = GridBagConstraints.NORTHWEST;
    content.add(
               resultFile = new JTextField(40),c);
    resultFile.setText("./result.txt");

    // the button panel
    JPanel buttonPanel = new JPanel(new FlowLayout());
    buttonPanel.add(btnTrain = new JButton("Train"));
    buttonPanel.add(btnRun = new JButton("Run"));
    buttonPanel.add(btnQuit = new JButton("Quit"));
    btnTrain.addActionListener(this);
    btnRun.addActionListener(this);
    btnQuit.addActionListener(this);

    // Add the button panel
    c.gridwidth = GridBagConstraints.REMAINDER; //end row
    c.anchor = GridBagConstraints.CENTER;
    content.add(buttonPanel,c);

    // Status label
    c.gridwidth = GridBagConstraints.REMAINDER; //end row
    c.anchor = GridBagConstraints.NORTHWEST;
    content.add(
               status = new JLabel("Click train to begin training..."),c);




    // adjust size and position
    pack();
    Toolkit toolkit = Toolkit.getDefaultToolkit();
    Dimension d = toolkit.getScreenSize();
    setLocation(
               (int)(d.width-this.getSize().getWidth())/2,
               (int)(d.height-this.getSize().getHeight())/2 );
    setDefaultCloseOperation(WindowConstants.DISPOSE_ON_CLOSE);
    setResizable(false);
  }

  /**
   * The main function, just display the JFrame.
   * 
   * @param args No arguments are used.
   */
  public static void main(String args[])
  {
    (new XorExample()).show(true);
  }

  /**
   * Called when the user clicks one of the three
   * buttons.
   * 
   * @param e The event.
   */
  public void actionPerformed(ActionEvent e)
  {
    if ( e.getSource()==btnQuit )
      System.exit(0);
    else if ( e.getSource()==btnTrain )
      train();
    else if ( e.getSource()==btnRun )
      run();
  }

  /**
   * Called when the user clicks the run button.
   */
  protected void run()
  {
    output.removeOutputSynapse(trainer);

    inputStream.resetInput();
    samples.resetInput();
    FileOutputSynapse results = new FileOutputSynapse();
    results.setFileName(resultFile.getText());
    results.setMonitor(monitor);
    output.addOutputSynapse(results);

    input.start();
    hidden.start();
    output.start();
    trainer.start();

    // number of rows (patterns) contained in the input file
    monitor.setPatterns(4);
    // How many times the net must process the input patterns
    monitor.setTotCicles(1);
    // The net must not learn while recalling
    monitor.setLearning(false);
    // The net starts processing the input
    monitor.Go();
    status.setText("Results written to " + resultFile.getText());
  }

  /**
   * The input layer of neurons.
   */
  SigmoidLayer input;

  /**
   * The hidden layer of neurons.
   */
  SigmoidLayer hidden;

  /**
   * The output layer of neurons.
   */
  SigmoidLayer output;

  /**
   * The monitor. Used to pass parameters to all of the
   * JOONE objects.
   */
  Monitor monitor;

  /**
   * The file input stream.
   */
  FileInputSynapse inputStream;

  /**
   * Used to train the neural network.
   */
  TeachingSynapse trainer;

  /**
   * The training data.
   */
  FileInputSynapse samples;


  /**
   * Called when the user clicks the train button.
   */
  protected void train()
  {
    // First, creates the three Layers
    input = new SigmoidLayer();
    hidden = new SigmoidLayer();
    output = new SigmoidLayer();
    input.setLayerName("input");
    hidden.setLayerName("hidden");
    output.setLayerName("output");

    // sets their dimensions
    input.setRows(2);
    hidden.setRows(3);
    output.setRows(1);

    // Now create the two Synapses
    // input -> hidden conn.
    FullSynapse synapse_IH = new FullSynapse();
    // hidden -> output conn.
    FullSynapse synapse_HO = new FullSynapse();

    synapse_IH.setName("IH");
    synapse_HO.setName("HO");
    t1=synapse_IH;
    t2=synapse_HO;


    // Connect the input layer with the hidden layer
    input.addOutputSynapse(synapse_IH);
    hidden.addInputSynapse(synapse_IH);

    // Connect the hidden layer with the output layer
    hidden.addOutputSynapse(synapse_HO);
    output.addInputSynapse(synapse_HO);

    // Create the Monitor object and set the learning parameters
    monitor = new Monitor();
    monitor.setLearningRate(0.8);
    monitor.setMomentum(0.3);

    // Pass the Monitor to all components
    input.setMonitor(monitor);
    hidden.setMonitor(monitor);
    output.setMonitor(monitor);

    // The application registers itself as monitor's listener
    // so it can receive the notifications of termination from
    // the net.

    monitor.addNeuralNetListener(this);

    inputStream = new FileInputSynapse();

    // The first two columns contain the input values
    inputStream.setFirstCol(1);
    inputStream.setLastCol(2);

    // This is the file that contains the input data
    inputStream.setFileName(inputFile.getText());
    input.addInputSynapse(inputStream);


    trainer = new TeachingSynapse();
    trainer.setMonitor(monitor);

    // Setting of the file containing the desired responses,
    // provided by a FileInputSynapse
    samples = new FileInputSynapse();
    samples.setFileName(inputFile.getText());

    // The output values are on the third column of the file
    samples.setFirstCol(3);
    samples.setLastCol(3);
    trainer.setDesired(samples);

    // Connects the Teacher to the last layer of the net
    output.addOutputSynapse(trainer);

    // All the layers must be activated by invoking their start
    // method; the layers are implemented as Runnable objects, so
    // they are allocated on separate threads. The threads will
    // stop after training and will need to be restarted later.
    input.start();
    hidden.start();
    output.start();
    trainer.start();

    // number of rows (patterns) contained in the input file
    monitor.setPatterns(4); 
    // How many times the net must be trained on the input
    // patterns
    monitor.setTotCicles(20000);
    // The net must be trained
    monitor.setLearning(true);
    // The net starts the training job
    monitor.Go();
  }

  /**
   * JOONE Callback: called when the neural network
   * stops. Not used.
   * 
   * @param e The JOONE event
   */
  public void netStopped(NeuralNetEvent e) {
  }

  /**
   * JOONE Callback: called to update the progress
   * of the neural network. Used to update the
   * status line.
   * 
   * @param e The JOONE event
   */
  public void cicleTerminated(NeuralNetEvent e) {
    Monitor mon = (Monitor)e.getSource();
    long c = mon.getCurrentCicle();
    long cl = c / 1000;
    // print the results every 1000 cycles
    if ( (cl * 1000) == c )
      status.setText(c + " cycles remaining - Error = "
                         + mon.getGlobalError());
  }

  /**
   * JOONE Callback: Called when the network
   * is starting up. Not used.
   * 
   * @param e The JOONE event.
   */
  public void netStarted(NeuralNetEvent e) {
  }

}

About the Author

Jeff Heaton is the author of Programming Spiders, Bots, and Aggregators in Java (Sybex, 2002). Jeff works as a software designer for the Reinsurance Group of America. Jeff has written three books and numerous magazine articles about computer programming. Jeff may be contacted through his Web site, http://www.jeffheaton.com.
