Then this is the right place for your contribution!
Just an idea for how to reduce the manual effort needed to find an optimal neural network architecture.
The user specifies a range for the number of neurons per hidden layer (e.g., 2-20), a range for the number of hidden layers (e.g., 1-3), and which activation functions from the available list should be tried. Finally, the user sets a limit on the total number of trials the program will run.
The program then generates the various candidate nets and trains each one until some minimum error on the test (or training) sample is reached. At the end of the procedure, the best-performing trained architecture is stored.
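The loop described above can be sketched roughly as follows. This is only an illustration in Python, not MemBrain script: the `evaluate` function is a hypothetical stand-in for "build the net, train it, and return its test error", and the activation names are made-up placeholders.

```python
import itertools
import random

def evaluate(sizes, activation):
    """Hypothetical stand-in for: build a net with the given hidden-layer
    sizes and activation, train it, and return its test-set error.
    Here it just returns a deterministic dummy value."""
    rng = random.Random(sum(sizes) * 31 + len(activation))
    return rng.random()

def search(neurons=(2, 20), layers=(1, 3),
           activations=("LOGISTIC", "TANH"), max_trials=50):
    """Enumerate layer counts, per-layer neuron counts and activation
    functions; evaluate each candidate and keep the best one found
    within the trial budget."""
    best_err, best_arch = float("inf"), None
    trials = 0
    for n_layers in range(layers[0], layers[1] + 1):
        # all combinations of layer sizes for this depth
        for sizes in itertools.product(
                range(neurons[0], neurons[1] + 1), repeat=n_layers):
            for act in activations:
                if trials >= max_trials:
                    return best_err, best_arch
                err = evaluate(sizes, act)
                trials += 1
                if err < best_err:
                    best_err, best_arch = err, (sizes, act)
    return best_err, best_arch

err, arch = search()
print(f"best error {err:.3f} with architecture {arch}")
```

In practice one would replace the exhaustive enumeration with random sampling once the search space gets large, since the number of combinations grows quickly with the layer count.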
This idea comes from my experience with other NN packages; it seems helpful and time-saving.
With the current MemBrain version you can already implement an approach like this yourself, using MemBrain's scripting language.
I'll also consider adding this as an example script. That way users would have the flexibility to extend it themselves.