So it is scaling, then.
Thank you for your prompt answer.
Search found 14 matches
- Tue 9. Apr 2013, 13:46
- Forum: All about using MemBrain
- Topic: Normalization
- Replies: 2
- Views: 22030
- Sun 7. Apr 2013, 19:09
- Forum: All about using MemBrain
- Topic: Normalization
- Replies: 2
- Views: 22030
Normalization
I'd like to ask about the feature in MemBrain called normalization.
Is it actually normalization, or rather scaling?
Many different names exist for these operations; the best answer would be the exact formula.
- Sat 2. Feb 2013, 10:02
- Forum: All about using MemBrain
- Topic: Think on Lesson
- Replies: 8
- Views: 46725
Re: Think on Lesson
Thank you for your answer, it is clear now.
- Thu 31. Jan 2013, 10:18
- Forum: All about using MemBrain
- Topic: Think on Lesson
- Replies: 8
- Views: 46725
Re: Think on Lesson
What I meant was that there is a difference between the data when teaching is finished and the data after you then perform ThinkOnLesson.
See the attached file.
- Wed 30. Jan 2013, 16:07
- Forum: All about using MemBrain
- Topic: Think on Lesson
- Replies: 8
- Views: 46725
Re: Think on Lesson
But in the case of a time-invariant network with no loops (input -> hidden layer 1 -> hidden layer 2 -> output),
shouldn't the results be exactly the same?
After teaching (with the network being reset after each lesson), when you ResetNet and then ThinkOnLesson, you don't get the same results.
I also checked this on the Mackey-Glass example ...
- Sun 27. Jan 2013, 19:15
- Forum: All about using MemBrain
- Topic: Think on Lesson
- Replies: 8
- Views: 46725
Re: Think on Lesson
It is a time-variant network,
so to obtain the same result I would have to do ThinkStep as many times as there are delay steps?
edit: I just realized it would not work.
During thinking the net remembers its last state,
i.e. the state after the last pattern was applied during teaching.
So how to ...
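The effect described above (a time-variant net carrying its internal state from one thinking step to the next) can be illustrated with a toy recurrent unit. This is purely illustrative Python, not MemBrain code; the update rule is an arbitrary assumption chosen only to show why the same input can yield different outputs unless the state is reset:

```python
class RecurrentUnit:
    """Toy time-variant unit: its output depends on an internal state
    that persists between thinking steps."""

    def __init__(self):
        self.state = 0.0

    def reset(self):
        # Analogous to a ResetNet: forget the accumulated state.
        self.state = 0.0

    def think_step(self, x):
        # The state carries over from the previous step, so the
        # result for a given input depends on the history so far.
        self.state = 0.5 * self.state + x
        return self.state

net = RecurrentUnit()
a = net.think_step(1.0)   # first step from a fresh state
b = net.think_step(1.0)   # same input, different output (state carried over)
net.reset()
c = net.think_step(1.0)   # matches the first result again after the reset
```

This is why, without a reset between teaching and thinking, applying the same lesson does not reproduce the same outputs.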
- Sun 27. Jan 2013, 15:17
- Forum: All about using MemBrain
- Topic: Think on Lesson
- Replies: 8
- Views: 46725
Think on Lesson
I have come across a problem with the thinking feature: when the network is trained and I choose Think from the Lesson Editor
or evaluate the network error, a strange thing happens - the results change (on the same data).
I checked, and it also happens in the attached examples, like the Mackey-Glass series.
Why is that?
Should it be ...
- Wed 12. Dec 2012, 20:18
- Forum: Project Support
- Topic: Neuron Output = Input
- Replies: 6
- Views: 34149
Re: Neuron Output = Input
Below I pasted part of the code I wrote;
could you have a look and tell me if it is OK?
How is the numbering of neurons done?
Is 0 the first neuron in the Lesson Editor?
Edit: I changed the code and it now seems to work,
but the results I get are not exactly the same;
there is a small difference in ...
- Tue 11. Dec 2012, 12:00
- Forum: Project Support
- Topic: Neuron Output = Input
- Replies: 6
- Views: 34149
Re: Neuron Output = Input
You don't have to perform this manually: MemBrain supports the so-called 'Normalization' setting for input and output neurons. There you can specify a user-defined value range for each neuron, which is then automatically mapped by MemBrain to the applicable internal range of the neuron according to ...
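A linear (min-max) mapping between a user-defined range and a neuron's internal activation range can be sketched as follows. Note that this formula is an assumption for illustration, not MemBrain's documented implementation; the internal range defaults of -1 and 1 are likewise assumed:

```python
def map_range(x, user_min, user_max, act_min=-1.0, act_max=1.0):
    """Linearly map x from the user-defined range [user_min, user_max]
    to the neuron's internal activation range [act_min, act_max]."""
    return act_min + (x - user_min) * (act_max - act_min) / (user_max - user_min)

# A value in the middle of the user range lands in the middle
# of the internal range, and the endpoints map to the endpoints.
mid = map_range(5.0, 0.0, 10.0)    # midpoint -> 0.0
lo = map_range(0.0, 0.0, 10.0)     # lower bound -> -1.0
hi = map_range(10.0, 0.0, 10.0)    # upper bound -> 1.0
```

So in this sense the feature is a linear scaling (an affine transform), not a statistical normalization such as subtracting the mean and dividing by the standard deviation.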
- Mon 10. Dec 2012, 12:42
- Forum: Project Support
- Topic: Neuron Output = Input
- Replies: 6
- Views: 34149
Re: Neuron Output = Input
Thank you, Thomas, for the reply, also in the other post.
I am aware of the IDENTICAL activation function, but it only works for inputs from -1 to 1, so it clips values below and above that range.
I can work around this by dividing the inputs so that they are in the range -1 .. 1.
Now I have a trained net and I want it to employ ...
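The divide-into-[-1, 1] workaround mentioned above amounts to a pair of scale/unscale helpers. A minimal sketch, assuming a known bound `max_abs` on the input magnitudes (the helper names are hypothetical, not MemBrain API calls):

```python
def scale(x, max_abs):
    """Divide by a known maximum magnitude so values fall in [-1, 1].
    max_abs is assumed to bound |x| over the whole data set; values
    outside that bound would still clip at the IDENTICAL activation."""
    return x / max_abs

def unscale(y, max_abs):
    """Invert the scaling to recover values in the original units,
    e.g. for interpreting a trained net's outputs."""
    return y * max_abs

# Round trip: scaling then unscaling returns the original value.
inside = scale(5.0, 10.0)             # 5.0 -> 0.5, inside [-1, 1]
restored = unscale(inside, 10.0)      # back to 5.0
```

The same `max_abs` used during training must of course be reused when feeding new data to the trained net, otherwise the inputs are on a different scale than the net was taught on.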