
Neural network from first principles - 2


The previous post laid out the structure of a basic neural network.

The example used was a 4-layer network with 12 neurons.

Set up like that, there is enough information to dynamically generate a scaled version of the network on a paintbox:

 

[Image: Network.JPG - the generated network rendering]

 

 

Clicking on any node on this canvas gives a message box with the salient details of that particular neuron.

(Here neuron 4 was clicked.)

procedure TNeuralNetwork.InitializeObject;
begin
  inherited;
 
...
 
  PaintBox1 := TW3PaintBox.Create(self);
 
  self.Handle.ReadyExecute( procedure ()
  begin
    if (csReady in ComponentState) then
    begin
      browserapi.document.addEventListener("click", procedure()
      begin
        var a : string := '';
        For var i := 0 to Neurons.Count-1 do
        begin
          If (browserapi.event.clientX > Neurons[i].NeuronCoord.X + self.PaintBox1.left)      and
             (browserapi.event.clientX < Neurons[i].NeuronCoord.X + self.PaintBox1.left + 40) and
             (browserapi.event.clientY > Neurons[i].NeuronCoord.Y + self.PaintBox1.top)       and
             (browserapi.event.clientY < Neurons[i].NeuronCoord.Y + self.PaintBox1.top  + 40) then
          begin
            a := 'Neuron ' + inttostr(i+1) + ' (' + Neurons[i].NeuronType + ' neuron)' + chr(10) +
                 'Output = ' + floattoStr(Neurons[i].NeuronOutPut,3) + chr(10) +
                 'Error  = '  + floattoStr(Neurons[i].NeuronError,3);
            For var j := 0 to Weights.Count -1 do begin
              If Weights[j].NeuronFrom = Neurons[i] then begin
                For var k := 0 to Neurons.Count-1 do begin
                  If Neurons[k] = Weights[j].NeuronTo then begin
                    a := a + chr(10) + 'Weight to ' + inttostr(k+1) + ' : ' +
                      floattostr(Weights[j].Weight,3)
                  end;
                end;
              end;
            end;
            ShowMessage(a);
          end;
        end;
      end);
//
      w3_callback( procedure ()
        begin
          ShowNetWork;
        end,50);
      end;
    end);
//
end;
 

Next is to start training.

Training is done by providing a number of examples. The network takes the input values of each example and calculates an output based on the topology and the weights assigned to the connections between the neurons. This is the feed-forward part.

Once an output is calculated, it is compared to the desired output from the example and all the weights are recalculated and adjusted. This is the back-propagation part of the training.

 

For simplicity's sake, the training data is kept in an in-memory record structure.

Obviously, once the training data reaches billions of records, another way of handling it needs to be implemented.

  TTrainingRecord = Class(TObject)
  public
    &inputs  : array of float;
    &outputs : array of float;
  end;
 
  TTrainingSet = Class(TObject)
  public
    TrainingRecords : Array of TTrainingRecord;
  end;
 
Procedure TNeuralNetwork.AddTrainingData (Inputs: Array of float; Outputs: Array of float);
begin
//
  var TrainingRecord := TTrainingRecord.Create;   // create a fresh record per call; destroying
  TrainingRecord.&inputs  := Inputs;              // the previous one would invalidate the record
  TrainingRecord.&outputs := Outputs;             // already stored in the training set
  TrainingSet.TrainingRecords.Add(TrainingRecord);
end;
 


   MyNetwork.AddTrainingData ([0, 0], [0]);           // XOR: 2 inputs, 1 output
   MyNetwork.AddTrainingData ([1, 0], [1]);
   MyNetwork.AddTrainingData ([0, 1], [1]);
   MyNetwork.AddTrainingData ([1, 1], [0]);
   MyNetwork.Train;
 

The following code executes a training session.

Step 1

Calculate the output values of the input neurons by simply copying the input values.

 

Step 2

For all other neurons calculate the output as the sum of the incoming connections (weight * value) and pass this sum through the activation function.

So for instance the output of neuron 4 is:

N4 = (output N1 * W1~4) + (output N2 * W2~4) + (output N3 * W3~4), and then calculate the sigmoid function as 1/(1+e^-N4), which gives the new output of neuron 4.
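Step 2 can be sketched in Python as a language-neutral illustration of the same arithmetic (the neuron outputs and weight values below are made up for the example; they are not taken from the network above):

```python
import math

def sigmoid(x):
    # activation function: 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

# hypothetical outputs of neurons 1-3 and the weights into neuron 4
outputs = {1: 1.0, 2: 0.0, 3: 1.0}          # neuron 1 here is the bias, output 1
weights_into_4 = {1: 0.5, 2: -0.3, 3: 0.8}  # W1~4, W2~4, W3~4

# weighted sum of the incoming connections ...
n4 = sum(outputs[i] * weights_into_4[i] for i in outputs)
# ... passed through the activation function gives the new output
output_4 = sigmoid(n4)
```

With these values N4 = 0.5 + 0 + 0.8 = 1.3, and the sigmoid squashes that into the (0, 1) range.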

 

Step 3

Once the output for the output neurons has been calculated, define the network error simply as the difference between the calculated output of the output neurons and the desired output as provided by the training record.
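As a one-line sketch (with made-up numbers), for a single output neuron that is:

```python
desired    = 1.0    # desired output from the training record
calculated = 0.73   # feed-forward output of the output neuron (hypothetical)

# the network error for this output neuron is simply the difference
error = desired - calculated
```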

 

Step 4

The network error needs to be distributed over all neurons, working the way back from output to input.

Back-propagate the network error proportional to the incoming weights.

So for instance, error N8 = W8~12 / (W8~12 + W9~12 + W10~12 + W11~12), multiplied by the error of N12.
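The proportional distribution of Step 4 can be sketched in Python (the weight values are hypothetical; only the arithmetic matters):

```python
# error of output neuron 12, and the weights feeding into it (made-up values)
error_12 = 0.25
weights_into_12 = {8: 0.4, 9: 0.1, 10: 0.3, 11: 0.2}   # W8~12 .. W11~12

# denominator of the proportion: total weight into neuron 12
weight_total = sum(weights_into_12.values())

# each source neuron receives a share of the error proportional to its weight
errors = {n: (w / weight_total) * error_12
          for n, w in weights_into_12.items()}
```

Note that the shares add back up to the original error, which is the point of distributing it proportionally.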

Procedure TNeuralNetwork.Train;
begin
//
  For var f := 0 to TrainingSet.TrainingRecords.Count - 1 do begin     // for each training record
//
//  handle input layer
//
    var d := 0;                                              // d = index of input neuron
    For var i := 0 to Neurons.Count -1 do begin
      If Neurons[i].Layer.LayerType = 'Input' then begin     // copy input values to layer[0].neuronoutput
        If Neurons[i].NeuronType = 'Bias' then               // bias always outputs 1 and does not
          Neurons[i].NeuronOutput := 1                       // consume a training input value
        else begin
          Neurons[i].NeuronOutput := TrainingSet.TrainingRecords[f].&inputs[d];
          d := d + 1;
        end;
      end;
    end;
//
//  handle subsequent layers
//
    For var j := 1 to Layers.Count -1 do begin
      For var k := 0 to Neurons.Count - 1 do begin
        If Neurons[k].Layer = Layers[j] then begin
          Neurons[k].NeuronOutput := 0;                      // reset before summing, so values from
          For var m := 0 to Weights.Count -1 do begin        // the previous record don't carry over
            If Weights[m].NeuronTo = Neurons[k] then begin
              Neurons[k].NeuronOutput :=
                Neurons[k].NeuronOutput + (Weights[m].Weight * Weights[m].NeuronFrom.NeuronOutput);
            end;
          end;
          If Neurons[k].Layer.ActivationType = 'Sigmoid' then begin
            Neurons[k].NeuronOutput :=                       // 1/(1+e^-x)
              1/(1+exp(-Neurons[k].NeuronOutput));
          end;
          If Neurons[k].NeuronType = 'Bias' then             // except for bias where output = 1
            Neurons[k].NeuronOutput := 1;
        end;
      end;
    end;
//
//  calculate error = training output(s) - output(s) in last layer
//
    d := 0;                                         // d = index of output neuron
    For var i := 0 to Neurons.Count -1 do begin
      If Neurons[i].Layer.LayerType = 'Output' then begin
        ShowMessage('Output : ' + FloatToStr(Neurons[i].NeuronOutput));
        Neurons[i].NeuronError :=
          TrainingSet.TrainingRecords[f].&outputs[d] - Neurons[i].NeuronOutput;
        ShowMessage('Error : ' + FloatToStr(Neurons[i].NeuronError));
        d := d + 1;
      end;
    end;
//
//  back-propagate error
//
    var WeightTot : float := 0;
    For var p := Layers.Count -1 downto 1 do begin              //start with output layer
      For var q := Neurons.Count -1 downto 0 do begin           //reset errors in the layer below,
        If Neurons[q].Layer = Layers[p-1] then                  //so shares can be accumulated
          Neurons[q].NeuronError := 0;
      end;
      For var q := Neurons.Count -1 downto 0 do begin
        If Neurons[q].Layer = Layers[p] then begin
          WeightTot := 0;                                       //calculate total weights into neuron
          For var r := Weights.Count -1 downto 0 do begin
            If Weights[r].NeuronTo = Neurons[q] then
              WeightTot := WeightTot + Weights[r].Weight;
          end;
          For var r := Weights.Count -1 downto 0 do begin       //distribute error proportionally
            If Weights[r].NeuronTo = Neurons[q] then begin
              For var s := Neurons.Count -1 downto 0 do begin
                If Weights[r].NeuronFrom = Neurons[s] then begin
                  Neurons[s].NeuronError := Neurons[s].NeuronError +  //accumulate: a neuron may feed
                    (Weights[r].Weight / WeightTot) * Neurons[q].NeuronError; //several neurons above
                end;
              end;
            end;
          end;
        end;
      end;
    end;
  end;
//
end;
 

Based on this, the only other thing to do is to adjust the weights of the network, which I still have to figure out.

The result, including weight adjustments, is viewable here.

Next post

[Image: NNetwork.jpg]

Thanks for sharing. I just read an article at http://www.blaisepascal.eu/ about AI and neural network applications.
 

Just came across this info and thought it significant enough to share here.

 

We are proud to announce our newest project: Delphi/Pascal compiler, Pascal to JavaScript.

Presented at the PASCAL - DELPHI FESTIVAL,
20 September 2016

PAS2JS  
Command line compiler, translates Pascal (subset of Delphi/ObjFPC syntax) to JavaScript.

Recursively compiles a Pascal program and its used units, similar to Delphi compiler and FPC. Ability to directly use JavaScript from Pascal. e.g. access the Browser DOM.

Planned Features

- DELPHI integration
- LAZARUS integration
- create project 
- compiler options
- compile
- jump to error
- code navigation 
- debug using Delphi and a browser
- Component Libraries
- IDE form designer add-on
