
How to save Best Network?

Posted: Mon 13. Sep 2010, 19:54
by MrCreosote
From what I can tell, the only user control is to specify an error level that, when reached, stops training.

What would be the best way to do the following conditional stops or saves:
  • Save the weights whenever a new minimum error is achieved,
  • Save the weights after every M epochs (incremental) or on the Mth epoch (absolute epoch number),
  • Stop training N epochs after the minimum error was achieved.
These could be based on Training or Validation errors, as specified by the user.

The only way I know how to do this would be to write a script that loops over epochs, where on each pass through the loop MemBrain learns a single epoch. At the end of each epoch, the error and the number of epochs would be checked to determine whether a save or a stop is desired. I would be concerned that this script looping would slow down the training process significantly.

Any thoughts would be greatly appreciated,
Tom

Re: How to save Best Network?

Posted: Tue 14. Sep 2010, 06:19
by Admin
MrCreosote wrote: The only way I know how to do this would be to write a script that loops over epochs, where on each pass through the loop MemBrain learns a single epoch.
Rather than training single epochs, I would recommend training continuously and checking the number of elapsed epochs within a script loop. That may lead to the script sometimes stopping a little too late (by one or more epochs, depending on the duration of an epoch). However, if the data is plausible and the net architecture really handles the data successfully, this should not matter. The same applies to the net error minimum stop condition.
MrCreosote wrote: I would be concerned that this script looping would slow down the training process significantly.

This is not an issue at all as long as your script loop contains a 'Sleep' statement, so that the script does not eat up significant computing power. Executing 'Sleep(100)' in the loop whenever there is nothing to do is absolutely sufficient. Additionally, the script engine runs in a separate thread in MemBrain. On a multi-core machine the scripting is therefore typically executed on a separate core and won't slow down MemBrain significantly even without the Sleep() command in place. Still, I strongly recommend having it.
The scripting language executes very quickly because MemBrain contains a real script compiler that compiles the scripts to byte code before execution; it is not an interpreter. It won't influence your training speed noticeably.
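In outline, such a supervision loop could look like this (a sketch in the script syntax; the placeholder comments stand for the actual teacher and error-query commands, which you can look up in the help under 'Scripting' - 'Command Reference'):

```
// ... start continuous training here (see the 'Teaching' commands
// in the Command Reference) ...

bool done = false;
while (!done)
{
    // Yield the CPU so the script thread does not eat up
    // significant computing power.
    Sleep(100);

    // ... check your stop/save conditions here: e.g. compare the
    // current net error against a stored minimum and count the
    // elapsed epochs, then set 'done' accordingly ...
}

// ... stop training here ...
```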

Regards,
Thomas

Re: How to save Best Network?

Posted: Thu 16. Sep 2010, 22:25
by MrCreosote
...checking the number of elapsed epochs within a script loop...
I've been searching the script command library and haven't found the call that brings back the "number of epochs". (I've surely missed it, but I've already spent a couple of hours trying to find it.)

I was thinking of making a complete, sortable list of the script commands - it's hard to find all the commands that contain "Get".

Re: How to save Best Network?

Posted: Fri 17. Sep 2010, 06:33
by Admin
MrCreosote wrote: I've been searching the script command library and haven't found the call that brings back the "number of epochs".
uint GetLessonReps()

is what you have been looking for. The term 'epoch' is not used in MemBrain.

Note that there is also a command

void ResetLessonReps()

to reset the lesson repetitions (i.e. epoch) counter.
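Putting the two commands together, a loop that saves the net on every new error minimum and stops N lesson repetitions after that minimum could be sketched as follows. Note that GetLastNetError() and SaveNet() below merely stand in for whichever error-query and save commands you pick from the Command Reference; only GetLessonReps(), ResetLessonReps() and Sleep() are the exact names given above:

```
const uint N = 50;           // stop N repetitions after the last minimum
double minError = 1.0e30;    // lowest net error seen so far
uint repsAtMin = 0;          // repetition count at the last minimum

ResetLessonReps();           // start the repetition counter at zero

while (true)
{
    Sleep(100);              // don't eat up computing power

    // GetLastNetError() / SaveNet() are placeholders - substitute the
    // actual error-query and save commands from the Command Reference.
    double err = GetLastNetError();
    if (err < minError)
    {
        minError = err;                // new minimum found
        repsAtMin = GetLessonReps();
        SaveNet("BestNet.mbn");        // file name chosen freely
    }

    if (GetLessonReps() >= repsAtMin + N)
        break;                         // no improvement for N repetitions
}
```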
MrCreosote wrote: I was thinking of making a complete, sortable list of the script commands - it's hard to find all the commands that contain "Get".
Just to make sure you are not only searching the example scripts when looking for available commands: there is a complete (though admittedly not alphabetically sorted) list of commands in the MemBrain help:
Section 'Scripting' - 'Command Reference'

The specific commands given above, for example, are located in the sub-section 'Teaching'.

Regards,
Thomas