Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
- Posts: 3
- Joined: Sun 21. May 2023, 02:42
Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
Hello. I am impressed with MemBrain-NN. It's my favorite neural network designer/editor/simulator tool out there. I have a few requests that would improve the user experience and capabilities of the app, which is already broadly and deeply versatile as it stands.
Suggestion 1:
Can you improve the convolution feature so that any paired-up set of networks with an identical structural layout (let's say networks A and B are identical, each having 5 input neurons, 9 neurons in hidden layer 1, 7 in hidden layer 2, and 10 outputs; the point is the identical structure) can be paired one-to-one as co-convolution nets, element by corresponding element? In other words, a smart wizard-like function that dynamically assigns the convolution matching between the individual neurons and links of two large, identically structured networks.
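For illustration only, here is a minimal sketch in Python of the pairing logic I have in mind (the net representation below is entirely made up for the example and is not MemBrain's; "convolution group" here just means a set of parameters to be tied to the same value): walk two or more identically shaped nets in lockstep and emit one sharing group per corresponding link weight and per corresponding neuron threshold.

```python
import numpy as np

# Entirely made-up net representation for illustration (not MemBrain's):
# a net is a list of layers, each holding a link-weight matrix "W" and a
# neuron-threshold vector "b".
def make_net(layer_sizes, rng):
    return [{"W": rng.standard_normal((n_in, n_out)),
             "b": rng.standard_normal(n_out)}
            for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

def pair_convolutions(nets):
    """Yield one sharing group per corresponding parameter across nets.

    Each group is ((layer index, kind, element index), values), where
    values are the per-net copies that a convolution would tie together.
    All nets are assumed to be identical in structure.
    """
    ref = nets[0]
    for other in nets[1:]:
        assert len(other) == len(ref), "layer counts must match"
        for la, lb in zip(ref, other):
            assert la["W"].shape == lb["W"].shape, "layer shapes must match"
    for li, layer in enumerate(ref):
        for idx in np.ndindex(layer["W"].shape):       # every link weight
            yield (li, "link", idx), [net[li]["W"][idx] for net in nets]
        for j in range(layer["b"].size):               # every threshold
            yield (li, "threshold", j), [net[li]["b"][j] for net in nets]

rng = np.random.default_rng(0)
nets = [make_net([5, 9, 7, 10], rng) for _ in range(2)]  # nets A and B
print(sum(1 for _ in pair_convolutions(nets)), "convolution groups")
```

On the 5-9-7-10 example this yields 204 groups: 45 + 63 + 70 link weights plus 9 + 7 + 10 thresholds, each paired one-to-one between the two nets.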
Suggestion 2:
Can you improve the sample batch processing feature to be more flexible, e.g. able to do micro-batching or gradient accumulation, rather than only offering online learning vs. global batching over all samples of the training set? Let me elaborate. A recurrent neural network trained with backpropagation through time will behave differently with micro-batching over the sequence length. Normally, when doing "Batch Learning" (per the option in the edit teacher section; I could also call it "global batching") on a feedforward net with a loopback link (therefore a basic RNN), it treats all the data points as one whole sequence, backpropagating through time over all time steps. With "micro-batching" I intend instead to run the teach step (backpropagation through time) on smaller sets within the one larger main set in one go; let's say mini-sequences within a larger sequence, if you get what I am struggling to find words to describe (see the sketch after the postscript below). Also include a supporting option to reset the net after each mini-sequence within the larger sequence, and a parameter for the length of the mini-sequences. As an analogy, imagine training a net on every fixed run of 5, or 9, or x words across a whole chapter of a book.
Those features would be awesome for completing a project of mine in computer vision. I am trying a new approach, almost a new architecture, if it is correct to say so. Thanks for hearing me out. All the best, it has been a pleasure!
POST SCRIPT (added 29 Dec 2024):
I want to clarify what I mean by "micro-batching", "gradient accumulation", etc., even if I happen to be misusing the terms. I used them to mean something that is not the same as so-called "mini-batching". As I understand it, "mini-batching" is just a method where, instead of one global batch (which is heavy on the processing hardware), the training set is divided into separate smaller batches, each trained on separately (which, if done strategically, e.g. with stochastic methods, can have advantages). That is not what I am talking about. I mean instead a method of computing the gradient first on small fixed-size sets of samples (in this case time-series data), then combining the gradients of all those small fixed sets via summation into one larger gradient, and finally applying a single training step on it all in one go. I am unsure what to call this correctly, but it is not to be confused with typical "mini-batching". I put this elaboration out there in case it is needed, since maybe I never explained it well enough. Thanks again.
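To make the intended behavior concrete, here is a minimal sketch in PyTorch (not MemBrain code; the dimensions, data, and flag names below are made up for the example): one long sequence is cut into fixed-length mini-sequences, backpropagation through time runs per mini-sequence, the resulting gradients are summed, and one single weight update is applied at the end. The reset_each_chunk flag is the requested option to reset the net after each mini-sequence; setting it to False and detaching the state instead gives classic truncated BPTT, where activations carry over but gradients stop at chunk boundaries.

```python
import torch
import torch.nn as nn

# Toy setup (all names and sizes are illustrative, not MemBrain's API):
# one long sequence is split into fixed-length mini-sequences; gradients
# are accumulated (summed) across all of them, and a SINGLE weight update
# is applied at the end, as described in the postscript above.
torch.manual_seed(0)
seq_len, chunk, in_dim, hid = 45, 5, 5, 9   # chunk = mini-sequence length
xs = torch.randn(seq_len, 1, in_dim)        # the whole "chapter"
ys = torch.randn(seq_len, 1, hid)           # per-step targets

rnn = nn.RNN(in_dim, hid)
opt = torch.optim.SGD(rnn.parameters(), lr=0.01)
loss_fn = nn.MSELoss(reduction="sum")

reset_each_chunk = True   # requested "reset net after each mini-sequence"
opt.zero_grad()
h = None
for t in range(0, seq_len, chunk):
    if reset_each_chunk:
        h = None                  # clear activations: fresh state per chunk
    elif h is not None:
        h = h.detach()            # keep state, but truncate BPTT at boundary
    out, h = rnn(xs[t:t + chunk], h)
    loss = loss_fn(out, ys[t:t + chunk])
    loss.backward()               # .grad sums (accumulates) across chunks
opt.step()                        # one single teach step for the whole set
```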
Last edited by learner411 on Sun 29. Dec 2024, 15:14, edited 1 time in total.
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
Hello, and many thanks for the positive feedback and the great suggestions!
Both of them seem to be interesting and sound improvements. Not quite small ones, certainly; I will have to see what is possible and in which time frame.
With respect to the 1st suggestion: You mean a kind of "wizard" that allows to establish convolutions between the corresponding elements of two identically structured networks, right? E.g. a convolution between the threshold of neuron 1 in layer 1 of network A and the threshold of neuron 1 in layer 1 of network B. Similarly between all corresponding link weights in the two networks?
If I get your idea right, then what options should such a wizard provide? It could take the two networks as input where each of them could be contained in a group element for identification/definition. That's the simple part, I guess. So what should be the exact feature set in terms of convolution options?
And probably it should be able to take more than just two nets as input, right? It could be interesting or important to be able to apply convolutions between an arbitrary number of identically shaped nets, right?
Thanks and best regards
Thomas Jetter
- Posts: 3
- Joined: Sun 21. May 2023, 02:42
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
Hello again, and many thanks for your caring and attentive reply, Thomas Jetter (if I may address you this way). You raised some good questions and reasonably pointed out some challenges the request would face before it could be realized properly. I would like to start by addressing some of your questions. When you asked:
<<[ "With respect to the 1st suggestion: You mean a kind of "wizard" that allows to establish convolutions between the corresponding elements of two identically structured networks, right?" ]>>
To that I say yes, I mean anything wizard-like (and not limited to that; it could be a regular one-click function or command, etc.), where your judgment would determine whether an actual literal wizard or something else is more suited.
<<[ "E.g. a convolution between the threshold of neuron 1 in layer 1 of network A and the threshold of neuron 1 in layer 1 of network B. Similarly between all corresponding link weights in the two networks?" ]>>
Absolutely, you're spot on; you do get what I meant. I appreciate that.
<<[ "If I get your idea right, then what options should such a wizard provide?" ]>>
Well, to be honest, in hindsight after being presented with that question, I am now reconsidering: maybe a one-click command button or something of the sort would do. As I imagine things would play out should such a button exist, with the two sets of candidate networks pre-selected (for example, one in the normal selection and the other in the extra-selection), only a click of a command button would be needed to complete the task; no wizard is necessary now, I suppose. But that is for you to decide.
By the way, I want to praise the extra-selection function you designed. Brilliant! It shows me how much you care about the MemBrain project.
<<[ "It could be interesting or important to be able to apply convolutions between an arbitrary number of identically shaped nets, right?" ]>>
Yes, to any given arbitrary number of identically shaped nets indeed; that would be a great part of the purpose, and liberating too. It would be problematic to be limited to a finite set of forced configuration presets for network shape criteria.
Oh, by the way, please take note of the postscript edit I made to my original post above. Thanks again.
<<[ "With respect to the 1st suggestion: You mean a kind of "wizard" that allows to establish convolutions between the corresponding elements of two identically structured networks, right?" ]>>
To that I say Yes I do mean anything wizard-like (and not limited to that, could be a regular one click function or command etc.) but only where you judgment would degerming if an actual literal wizard or something else if more suited.
<<[ "E.g. a convolution between the threshold of neuron 1 in layer 1 of network A and the threshold of neuron 1 in layer 1 of network B. Similarly between all corresponding link weights in the two networks?" ]>>
Absolutely your spot on, you do get what I meant. I appreciate that.
<<[ "If I get your idea right, then what options should such a wizard provide?" ]>>
Well to be honest in hindsight after being presented with that question, that is now why I am especially reconsidering maybe some on click command button or something of the sort. because as I imagine as things would play out should such a button exist, as the two set of candidate networks are pre-selected (for example one being normal selection and the other being extra-selection) only a click of a command button is needed to complete the task not wizard is necessary now I suppose. but that is for you to decide.
by the way I want to praise the extra-selection function you designed. brilliant! It show me how much you care about the Membrain project/work.
<<[ "It could be interesting or important to be able to apply convolutions between an arbitrary number of identically shaped nets, right?" ]>>
Yes to any given arbitrary number of identically shaped net indeed. that would be great part of the purpose and liberating too. It would be problematic to only be limited to a finite set of configuration presets for network shape criteria, being force required.
oh by the way please pay notice to a Post-script extension edit made to the original post I made above. thanks again.
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
Hello again, and many thanks for the detailed additional feedback. I think we are on the same page so far.
I implemented a first solution for Suggestion #1. I haven't released it so far, however, because I want to do more testing and also need to adapt the help file first. The solution is now based on the already existing "Group Relation" feature in MemBrain: Sub Nets can already be defined in MemBrain by using "Groups" and then "Group Relations". If you don't know about the feature yet, please take a look into the help file in section "Using Group Relations to work with Sub Nets".
If there are two or more Group Relations (i.e. sub nets) defined in the loaded net, then there will be a new menu option (Edit-->Convolutions...). A dialog comes up that allows selecting any number of the defined Sub Nets and defining, for each available layer in the nets, which convolutions shall be added (neuron thresholds, outgoing links, incoming links).
I will post here again once the release is online.
Kind regards,
Thomas
Thomas Jetter
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
Hello again,
I now released the new version 15.02.00.00. Please have a try on the new feature (#1 above) and provide feedback if you like. Would be highly appreciated!
Thanks and regards
Thomas Jetter
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
15.02.01.00 is now online which includes some improvements and fixes.
Thomas Jetter
- Posts: 3
- Joined: Sun 21. May 2023, 02:42
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
Hello again. Much thanks and gratitude go out for your effort and invested interest; nothing less is expected of you in that regard.
I am glad to see that the convolution feature is implemented. I must say, though, that I am having difficulty getting it to work correctly: in my attempts to use it, it keeps failing to establish any convolution at all. I have always had a poor grasp of group relations. I feel that some examples and a simplified, expanded explanation of what is actually happening in a group relation, at a conceptual and intuitive level, would help me nail it. I have always sensed that group relations add powerful capabilities to MemBrain-NN's arsenal of features, and it feels fitting that you used that feature to implement network set-to-set ordered convolutions; well played. But could you give some extended lessons on group relations as a whole? For my personal understanding, I find the help file's treatment of the topic too concise and cryptic, almost esoteric (written, it would seem, for those who already have the prerequisite knowledge).
Can't thank you enough. Thanks and best regards to you too, Thomas Jetter.
Re: Improvement ideas for smart convolution assignment wizard, and a flexible advanced data batch teach step method
There's probably not much information required to get you going on group relations. Maybe just follow the steps below to grasp the idea:
1.) Create a typical net in MemBrain which matches your idea of a "sub net": E.g. some input neurons, some hidden neurons (may be in multiple layers) and some output neurons.
2.) Select all neurons of your net's first layer (in the above example this is the input layer) and group them (context menu on one of the selected neurons --> "Group element(s)").
3.) Double-click the new group and give it a name you will remember (e.g. "inputs").
4.) Do the same with all neurons of your net's last layer (in the example, this would be all output neurons). Name this new group again, e.g. "outputs".
5.) Double-click on the "inputs" group again. In the properties dialog of the group you will see an area called "Relations to other groups". Click "New Relation" here. Another dialog will come up.
6.) Select relation type "GENERIC SUB NET" in the pull down. Select the group "outputs" (or whatever you named it) as "Target Group". In the edit field "Relation Name" provide a name you will remember. This name represents your sub net. I.e. call it "sub net 1". You can also provide a comment to this group relation (or sub net if you like). Click on OK in both dialogs.
7.) Select the whole sub net and copy/paste it in order to create another sub net, including the group relation which is copied, too.
8.) Edit the starting (i.e. input) group of the just-copied net (e.g. double-click on the group if it is collapsed, or right-click on one of its neurons and select "Edit owning group" if it is uncollapsed). Select the copied relation in the group properties dialog, press "Edit Relation" and give it a different name (e.g. "sub net 2"). Press OK on both dialogs.
--> You now have two sub nets defined, named "sub net 1" and "sub net 2". Check out the drop-down list in MemBrain's menu bar: You will see there the three options "Full Net", "sub net 1" and "sub net 2". Select one of the sub nets and you will see that the scope (here: for training/teaching) is set to the corresponding sub net.
--> You will also see that the menu function "Edit/Convolutions.../Establish Sub Net Convolutions..." is now enabled (there is also a corresponding toolbar icon for this).
9.) Click on that menu option (or the toolbar button). It will bring up a dialog where you can select the two sub nets (use Ctrl+Left mouse button for multi select) and the convolutions per layer to be established. When clicking OK a message box will tell you how many convolutions were added.
10.) Select one of the neurons or links in one of the sub nets, right-click and select "Select Convolutions...". You will see that the corresponding convolution partners get selected.
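As a conceptual side note (a simplified illustration only, not MemBrain's actual internals): a convolution ties corresponding parameters so that they always hold the same value, which means a training step must move all tied copies together. One common way to train tied parameters is to sum the gradient contributions of all copies and apply the identical update to each; a minimal sketch of that idea:

```python
import numpy as np

# Simplified illustration (not MemBrain's internals): a convolution ties
# corresponding link weights of sub net 1 and sub net 2 to the same value.
w1 = np.array([0.5, -0.3])    # two link weights in sub net 1
w2 = w1.copy()                # the tied counterparts in sub net 2

g1 = np.array([0.10, 0.02])   # gradient contribution from sub net 1
g2 = np.array([0.04, -0.06])  # gradient contribution from sub net 2

lr = 0.1
update = lr * (g1 + g2)       # tied copies share one combined update
w1 -= update
w2 -= update
assert np.allclose(w1, w2)    # the convolution keeps them identical
```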
Please provide feedback if this works!
Note: There has been an update published recently with some bug fixes, the latest MemBrain version is 15.02.01.00. Ensure that you have it installed. Best is to enable automatic check for updates in MemBrain's help menu!
Best regards
Thomas Jetter