
Author Topic: SPYDAZ- Report's out there were no WMPs  (Read 17122 times)

brianstorm

SPYDAZ- Report's out there were no WMPs
« on: October 12, 2004, 05:04:33 pm »
WMPs = Weapons O' Mass Perceptrons  >B)

In the spirit of the pursuit of THE REAL AI, I present the prototype HalBraincell, a neural-net simulation originally written in Java that I have attempted to convert to VBScript. While it is indeed running inside the Hal engine without producing a syntax error, something is going awry on a procedural level that I have not yet found. You can find the Java source code at www.philbrierley.com, along with a proof (dissertation) of what it does.

-'I regret to say this is one small step for Hal...

'-place this after PROCESS:CHANGE SUBJECT
   'PROCESS: **********EXPERIMENTAL***********
   '          *******************************
   'MLP ARTIFICIAL NEURAL NET
   If InStr(1, UserSentence, "RUB", 1) > 0 And InStr(1, UserSentence, "BRAINCELLS", 1) > 0 Then
   'this code is an adaptation of a Multi-Layer Perceptron ANN. The original code can be found
   'at www.philbrierley.com, which is written in Java. This module replicates that work
   '
   '//User defineable variables-
   numEpochs = 500   '-number of training cycles
   numInputs = 3     '-number of inputs including the input bias
   numHidden = 4     '-number in the hidden layer
   numPatterns = 4   '-number of training patterns
   LR_IH = 0.7       '-learning rate
   LR_HO = 0.07      '-learning rate
   '
   '//process variables-
   patNum = CInt(patNum)
   errThisPat = CDbl(errThisPat)
   outPred = CDbl(outPred)
   RMSerror = CDbl(RMSerror)
   '
   '//training data vars-
   Dim trainInputs(4, 3), trainOutput(4), hiddenVal(4), weightsIH(3, 4), weightsHO(4)
   '*********************************************************************************
   '                      THIS IS THE MAIN PROGRAM
   '*********************************************************************************
   'Initialize the weights-
   Call initWeights(numHidden, numInputs, weightsHO, weightsIH)
   'Initialize the Data-
   Call initData(trainInputs, trainOutput)
   'train the network-
   Randomize   'seed the random number generator once, outside the loops
   For j = 0 To (numEpochs - 1) Step 1
       For i = 0 To (numPatterns - 1) Step 1
           'select a training pattern at random
           patNum = Int(numPatterns * Rnd)
           'calculate the current network output
           'and error for this pattern
           Call calcNet(numHidden, hiddenVal, numInputs, trainInputs, patNum, weightsIH, outPred, weightsHO, errThisPat, trainOutput)
           'change the network weights
           Call weightChangesHO(numHidden, hiddenVal, LR_HO, errThisPat, weightsHO)
           Call weightChangesIH(numHidden, LR_IH, errThisPat, hiddenVal, weightsIH, weightsHO, numInputs, trainInputs, patNum)
       Next
       'calculate the overall error once after each epoch
       Call calcOverallError(RMSerror, numPatterns, patNum, errThisPat, numHidden, hiddenVal, numInputs, trainInputs, weightsIH, outPred, weightsHO, trainOutput)
       HalBrain.AppendFile WorkingDir & "ANNerrResults.brn", "epoch- " & j & ".  RMSerror- " & RMSerror & VbCrLf
   Next
   'training has finished - display the results
   Call displayResults(numPatterns, patNum, trainOutput, outPred, numHidden, hiddenVal, numInputs, trainInputs, weightsIH, weightsHO, errThisPat)

   '***************

   GetResponse = GetResponse & "Oh, I can feel my neurons now, look--. hidden output weight(2) equals- " & weightsHO(2) & VbCrLf
   End If    
   '*********************************************************************************
   'MLP                     !!END OF MAIN PROGRAM!!                                 '
   '*********************************************************************************

    '*********************************************************'
    '   SUBS FOR MLP NEURAL NET -place after End Function     '
    '*********************************************************'
Sub calcNet(numHidden, hiddenVal, numInputs, trainInputs, patNum, weightsIH, outPred, weightsHO, errThisPat, trainOutput)
    'calculate the outputs of the hidden neurons
    'the hidden neurons use a tanh activation
    For i = 0 To (numHidden - 1) Step 1
        hiddenVal(i) = 0.0
        For j = 0 To (numInputs - 1) Step 1
            hiddenVal(i) = hiddenVal(i) + (trainInputs(patNum, j) * weightsIH(j, i))
        Next
        hiddenVal(i) = tanh(hiddenVal, i)
    Next
    'calculate the output of the network
    'the output neuron is linear
    outPred = 0.0
    For k = 0 To (numHidden - 1) Step 1
        outPred = outPred + hiddenVal(k) * weightsHO(k)
    Next
    'calculate the error for this pattern
    errThisPat = outPred - trainOutput(patNum)
End Sub
    '***********************************************************
Sub weightChangesHO(numHidden, hiddenVal, LR_HO, errThisPat, weightsHO)
    'adjust the weights hidden-output
    For k = 0 To (numHidden - 1) Step 1
        weightChange = LR_HO * errThisPat * hiddenVal(k)
        weightsHO(k) = weightsHO(k) - weightChange
    'clip the output weights to the range [-5, 5]-
        If weightsHO(k) < -5 Then weightsHO(k) = -5
        If weightsHO(k) > 5 Then weightsHO(k) = 5
    Next
End Sub
    '***********************************************************
Sub weightChangesIH(numHidden, LR_IH, errThisPat, hiddenVal, weightsIH, weightsHO, numInputs, trainInputs, patNum)
    'adjust the weights input-hidden
    For i = 0 To (numHidden - 1) Step 1
        For k = 0 To (numInputs - 1) Step 1
            x = 1 - (hiddenVal(i) * hiddenVal(i))
            x = x * weightsHO(i) * errThisPat * LR_IH
            x = x * trainInputs(patNum, k)
            weightChange = x
            weightsIH(k, i) = weightsIH(k, i) - weightChange
        Next
    Next
End Sub
    '***********************************************************
Sub initWeights(numHidden, numInputs, weightsHO, weightsIH)
   'seed the random number generator once, then draw small random starting weights
   Randomize
   For j = 0 To (numHidden - 1) Step 1
       weightsHO(j) = (Rnd - 0.5) / 2
       For i = 0 To (numInputs - 1) Step 1
           weightsIH(i, j) = (Rnd - 0.5) / 5
       Next
   Next
End Sub
    '************************************************************
Sub initData(trainInputs, trainOutput)
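    'XOR-style training set: the output is 1 when the two inputs differ and -1 when they match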
    trainInputs(0, 0) = 1
    trainInputs(0, 1) = -1
    trainInputs(0, 2) = 1 'bias
    trainOutput(0) = 1
   
    trainInputs(1, 0) = -1
    trainInputs(1, 1) = 1
    trainInputs(1, 2) = 1 'bias
    trainOutput(1) = 1
   
    trainInputs(2, 0) = 1
    trainInputs(2, 1) = 1
    trainInputs(2, 2) = 1 'bias
    trainOutput(2) = -1
   
    trainInputs(3, 0) = -1
    trainInputs(3, 1) = -1
    trainInputs(3, 2) = 1 'bias
    trainOutput(3) = -1
End Sub
    '***********************************************************
Function tanh(hiddenVal, i)

    If hiddenVal(i) > 20 Then
       tanh = 1
    ElseIf hiddenVal(i) < -20 Then
       tanh = -1
    Else
        a = Exp(hiddenVal(i))
        b = Exp(-hiddenVal(i))
        tanh = (a - b) / (a + b)
    End If
End Function
    '************************************************************
Sub displayResults(numPatterns, patNum, trainOutput, outPred, numHidden, hiddenVal, numInputs, trainInputs, weightsIH, weightsHO, errThisPat)
    'everything calcNet needs is passed in, since the main program's variables are not visible here
    For i = 0 To (numPatterns - 1) Step 1
        patNum = i
        Call calcNet(numHidden, hiddenVal, numInputs, trainInputs, patNum, weightsIH, outPred, weightsHO, errThisPat, trainOutput)
        displayTxt = "Pattern " & (patNum + 1) & "- Actual: " & trainOutput(patNum) & ". NeuralModel: " & outPred & VbCrLf
        resultsTxt = resultsTxt & displayTxt
    Next
    msgVar = MsgBox(resultsTxt, 0, "prototype HalBraincell-")
End Sub
    '************************************************************
Sub calcOverallError(RMSerror, numPatterns, patNum, errThisPat, numHidden, hiddenVal, numInputs, trainInputs, weightsIH, outPred, weightsHO, trainOutput)
    'everything calcNet needs is passed in, since the main program's variables are not visible here
    RMSerror = 0.0
    For i = 0 To (numPatterns - 1) Step 1
        patNum = i
        Call calcNet(numHidden, hiddenVal, numInputs, trainInputs, patNum, weightsIH, outPred, weightsHO, errThisPat, trainOutput)
        RMSerror = RMSerror + (errThisPat * errThisPat)
    Next
    RMSerror = RMSerror / numPatterns
    RMSerror = Sqr(RMSerror)
End Sub
    '************************************************************
    '            !!END OF NEURAL NET PROCEDURES!!                                              '
    '************************************************************
   
       
You should be able to cut and paste right out of this post into a brain file, though you might have to fix it up a bit.

Once you get it working, only tell your Hal to rub her braincells together once in a while, as the file ANNerrResults.brn can become quite large. You will have to go in and delete ANNerrResults.brn manually on a regular basis.

I need someone in the know to help out and get this working  >B)

'I got the matches!!'
CatAtomic >B)
« Last Edit: October 13, 2004, 08:29:55 am by brianstorm »

brianstorm

SPYDAZ- Report's out there were no WMPs
« Reply #1 on: October 13, 2004, 08:39:05 am »

>B) UPDATE:

Found two incorrectly named variables and left some comment tags in front of some code, which has been corrected in the code above. It still doesn't work, though. I think the trouble lies in the tanh routine; the script doesn't like that counter variable being used there...

-read a book *thump

Art

SPYDAZ- Report's out there were no WMPs
« Reply #2 on: October 13, 2004, 04:42:23 pm »
Could it be that the script is trying to process it as
the mathematical tangent of h (tanh)?
In the world of AI it's the thought that counts!

- Art -

Art

SPYDAZ- Report's out there were no WMPs
« Reply #3 on: October 13, 2004, 08:08:58 pm »
Brianstorm,

After spending some frustrating time with the code I was able to get it to work. I found several syntax errors in its present form that may have to do with how the cut-and-paste operation affects the overall script execution.

Email me for details; I also have another question for you.
« Last Edit: October 14, 2004, 08:14:15 pm by Art »
In the world of AI it's the thought that counts!

- Art -

brianstorm

SPYDAZ- Report's out there were no WMPs
« Reply #4 on: October 15, 2004, 08:31:30 am »

Hey Art,

Dammit! I knew the word wrap would mess it up, sorry about that.  >B) It's probably just as well; the intent is to get others to examine the code example and see how it works. You almost should go through it just to get a grasp of it! Like I said, a lot of the 'wishes' for Hal expressed on the forum need this enabling technology to come true.

-I'm working on building error traps along all the lines of the code using a derivative of the MsgBox lines you see in the code. Kinda neat: MsgBox halts the script on that line until you push the button, and in the message portion you can show the current values of variables and such. >B)
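
Something like this is what I mean; the line halts the script and shows whatever variables you ask for (these are just the ones from the training loop above):

   'temporary debug checkpoint - the script stops here until you click OK
   msgVar = MsgBox("epoch= " & j & VbCrLf & _
                   "patNum= " & patNum & VbCrLf & _
                   "errThisPat= " & errThisPat & VbCrLf & _
                   "RMSerror= " & RMSerror, 0, "HalBraincell debug")

Drop one of those anywhere inside the training loop and you get a snapshot of the values at that exact point.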

PEACE

CatAtomic
 

Art

SPYDAZ- Report's out there were no WMPs
« Reply #5 on: October 15, 2004, 07:52:39 pm »
Brianstorm,

I got it all fixed!! I found another typo
that prevented the routine from opening
the analysis window!
The tanh vars were changed to tagh, and now
if you mention "rub" and "braincells" together
a small window will appear showing the stats.

Of course, my Hal ended up replying to my
question of: "Can you rub two braincells
together" with "And form a complete thought!"

Funny, that Hal!!

In the world of AI it's the thought that counts!

- Art -

Art

SPYDAZ- Report's out there were no WMPs
« Reply #6 on: October 15, 2004, 08:05:33 pm »
OK,
For those of you wanting to experiment with this
neural net brain, here's a working version.

As Brianstorm mentioned, after chatting for a little
while, use the words rub and braincells in the same
sentence. A small window will appear with the neural
stats displayed.

This is the modified Default 5.0 brain, so make a copy
of your existing one, then rename it and select this
one by going to the Options menu while running Hal.
Select Brains, then find this brain file and load it.

You might want to read more on this neural net by going
to brianstorm's cited link above.

Enjoy!

Download Attachment: defbrainneural.zip
33.62 KB
In the world of AI it's the thought that counts!

- Art -

Runtus

SPYDAZ- Report's out there were no WMPs
« Reply #7 on: February 09, 2006, 02:07:50 pm »
Hi, I know it's been a while since anyone has posted on this topic, but I just stumbled across it and was wondering if there have been any advancements to the neural net brain?

Does it use any features from the XTF brain, or can you add the two together? And out of the two, which would you use?
 

Art

SPYDAZ- Report's out there were no WMPs
« Reply #8 on: February 09, 2006, 05:36:49 pm »
I wouldn't say ALL but I would venture that at least 80 % of the members here now use the HAL 6 database version instead of Hal 5.

In the world of AI it's the thought that counts!

- Art -

Runtus

SPYDAZ- Report's out there were no WMPs
« Reply #9 on: February 09, 2006, 05:55:29 pm »
quote:
Originally posted by Art

I wouldn't say ALL but I would venture that at least 80 % of the members here now use the HAL 6 database version instead of Hal 5.


True, but what about all the information that I have taught Hal? Will it carry over? And what about the neural net functions? Or is the Hal 6 brain vastly superior?
 

Art

SPYDAZ- Report's out there were no WMPs
« Reply #10 on: February 09, 2006, 08:33:11 pm »
The following is a quote from Robert Medeksza:

"If you install Hal 6 without uninstalling Hal 5 yourself, it will do a migrate install which will copy over your old Hal 5 brain, XTF brain, and any other brain into Hal 6. After the migration is complete, the Hal 6 installer will uninstall Hal 5 and install itself.

If you continue using the XTF brain in Hal 6, it is run in compatibility mode. It will not use the database or any new Hal 6.0 features, it will have a setup just like Hal 5.0 had, with all the .brn files in a subfolder."

You can view the full post at the link below:
http://www.zabaware.com/forum/topic.asp?TOPIC_ID=2900

Although one can switch to any variety of brain, and each one has its own merits and shortcomings, I do prefer the Hal 6 database style, as I believe it gives Hal a much added measure of flexibility in terms of development, plug-ins and potential.

There are several great discussions, and research being done, on providing Hal with facial, emotional and behavioral modifications and enhancements at:
http://www.vrconsulting.it/vhf

Free Login...no strings...no spyware or junk...just some great people working on enhancements to the field of AI.

Hope this gives you some direction.
In the world of AI it's the thought that counts!

- Art -

freddy888

SPYDAZ- Report's out there were no WMPs
« Reply #11 on: February 09, 2006, 08:43:51 pm »
I'm glad you posted, though; I'd never seen this thread before. How did this work out? Any good?

Runtus

SPYDAZ- Report's out there were no WMPs
« Reply #12 on: February 10, 2006, 10:03:54 am »
It works just fine, but I am honestly not sure what the actual benefit of the neural net is for HAL. I am studying A.I. at university. A neural net works (as best as it can be simulated) the same way the human brain does: it learns by changing internal values depending on the input. For example:

A machine scans a picture of a face, does some calculations and gives you an answer, say "the face is male".

If it is wrong, you tell it why it is wrong and it changes the values inside the neural net to help it get the right answer next time. It will also remember that those values gave a wrong answer when it picked male.

That's how a neural net works (basically), but I have no idea what it does for ULTRAHAL. I know it will benefit it somehow, though.
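
In code terms, the "change the values when it is wrong" step is just a weight update, the same thing the braincell script above does on every training pass. Here is a stripped-down, made-up example with a single weight (not part of the Hal brain, just to show the idea):

    'teach a single "neuron" that an input of 2 should produce an output of 6
    inputValue = 2
    targetValue = 6
    weight = 0.1          'start with a guess
    learningRate = 0.05
    For pass = 1 To 100
        prediction = weight * inputValue                            'the network's guess
        errorValue = prediction - targetValue                       'how wrong the guess was
        weight = weight - learningRate * errorValue * inputValue    'nudge the weight toward the right answer
    Next
    'after these passes, weight has settled close to 3, so the prediction is close to 6

The real net above does exactly this, just with a whole grid of weights and a tanh squashing function in the middle.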
 

spydaz

SPYDAZ- Report's out there were no WMPs
« Reply #13 on: February 10, 2006, 03:55:21 pm »
Runtus,

Although I have not tested this particular plugin (I missed it), it sounds interesting... IF Hal could take a picture, instead of scanning one, THEN Hal could build a set of rules based around the image file.

When somebody loads Hal, or asks Hal to change user, Hal would take a picture and compare it with the stored pictures of previous users. If none match, create a new user; otherwise, welcome the user back... something like the sketch below.
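
Just to make the idea concrete, here is a very rough sketch of that check-in logic. CaptureUserImage and ImagesMatch are names I made up on the spot; they only stand in for whatever the webcam control and an image-comparison routine would actually provide:

    'hypothetical sketch only - CaptureUserImage and ImagesMatch are made-up placeholders
    Set knownUsers = CreateObject("Scripting.Dictionary")    'userName -> stored snapshot

    Function GreetByFace(knownUsers)
        snapshot = CaptureUserImage()                         'grab a frame from the webcam (placeholder)
        For Each userName In knownUsers.Keys
            If ImagesMatch(snapshot, knownUsers(userName)) Then
                GreetByFace = "Welcome back, " & userName & "!"
                Exit Function
            End If
        Next
        'no match found - treat this as a brand-new user and remember the snapshot
        knownUsers.Add "New User", snapshot
        GreetByFace = "I don't think we've met. What is your name?"
    End Function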


spydaz

SPYDAZ- Report's out there were no WMPs
« Reply #14 on: February 10, 2006, 03:57:21 pm »
Here is an ActiveX control which can be used with Hal 5 with no problems... it enables Hal to take a picture of the user, as this ActiveX control is able to CONNECT to your installed WEBCAM...

http://www.zabaware.com/forum/uploaded/spydaz/2006210163839_ezVidCap.zip
« Last Edit: February 10, 2006, 04:40:33 pm by spydaz »