Difference in accuracy in FC-DNN across different runs
I have been facing an issue of instability in deep neural networks. For example, with a DNN of fully connected layers (3 layers, 128 units each), I get an unweighted accuracy ranging from 62% to 66% across different runs (yes, on the same train and test set). I used the Xavier initializer with biases initialized to zero, and all other parameters are fixed as well.
Has anyone faced this problem? I have tried using a fixed seed too, but it doesn't seem to help. I'm a bit puzzled about what is causing this instability.
Thanks in advance :)
neural-network deep-learning
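For context, a minimal sketch of what fully fixing the seed typically involves; the question does not name a framework, so PyTorch is assumed here and all names are illustrative. On a GPU, seeding alone is often not enough: some cuDNN kernels are nondeterministic unless explicitly disabled, which may explain why a fixed seed "doesn't seem to help".

    import random
    import numpy as np
    import torch

    def seed_everything(seed: int = 42) -> None:
        # Seed every RNG source that can affect a training run.
        random.seed(seed)                 # Python stdlib (e.g., data shuffling)
        np.random.seed(seed)              # NumPy (e.g., preprocessing, batching)
        torch.manual_seed(seed)           # PyTorch CPU RNG (weight init, dropout)
        torch.cuda.manual_seed_all(seed)  # every GPU RNG
        # Force deterministic cuDNN kernels; without these two flags a GPU run
        # can still vary between executions even with all seeds fixed.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False

    seed_everything(42)  # call once, before building the model and data loaders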
asked yesterday by Albert
This is completely normal, and it's due to the random weight initialization.
– Matias Valdenegro
yesterday
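To illustrate the point (a hedged sketch, again assuming PyTorch): two identically configured layers both using the Xavier initializer start from different weights unless the RNG is re-seeded before each draw, so each training run descends from a different starting point.

    import torch
    import torch.nn as nn

    a = nn.Linear(128, 128)
    b = nn.Linear(128, 128)
    nn.init.xavier_uniform_(a.weight)  # each call consumes fresh RNG state
    nn.init.xavier_uniform_(b.weight)
    print(torch.allclose(a.weight, b.weight))  # False: different starting points

    # Re-seeding before each init makes the draws identical.
    torch.manual_seed(0)
    nn.init.xavier_uniform_(a.weight)
    torch.manual_seed(0)
    nn.init.xavier_uniform_(b.weight)
    print(torch.allclose(a.weight, b.weight))  # True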